AI in HR: How Automated Tools Create Pregnancy Discrimination Risk

AI hiring and performance tools can generate discriminatory outcomes for pregnant workers. Learn what employers need to know about AI in HR decisions.

April 7, 2026

[Image: HR manager reviewing AI hiring tool output on a computer screen]

AI tools are showing up in every part of the hiring and management process. Resume screeners, scheduling algorithms, performance scoring systems, and attendance trackers all promise to make HR decisions faster and more consistent. But faster does not mean legally safe.

If your business uses automated tools for hiring, scheduling, or performance reviews, you need to understand how those tools can create pregnancy discrimination liability, even when nobody intended to discriminate. It is a legal risk that many employers overlook.

The fix starts with auditing how your tools handle pregnancy-related absences and building human review into every adverse employment decision. Here is what employers need to know.

The Problem with Automated HR Decisions

Most AI-powered HR tools work by identifying patterns in data. A resume screener learns what a "successful" candidate looks like based on past hires. A scheduling algorithm optimizes for availability and consistency. A performance tracker flags employees whose output drops or whose attendance changes.

These tools do not know why a pattern exists. They just act on it.

That becomes a legal problem when a tool penalizes workers for pregnancy-related absences, medical appointments, or temporary changes in availability. This is a form of algorithmic discrimination, even though no human made a biased decision. An attendance algorithm that flags employees who miss more than a set number of days will flag pregnant workers who attend prenatal appointments. A performance scoring tool that weights "consistency" or "availability" will downgrade workers who take intermittent leave. A scheduling tool that deprioritizes employees who have declined shifts will sideline pregnant workers who requested accommodations.

None of these outcomes require a manager to make a biased decision. The tool makes it for them.

What the Law Says

Federal law prohibits pregnancy discrimination in employment under Title VII of the Civil Rights Act and the Pregnant Workers Fairness Act (PWFA). The PWFA, which took effect in June 2023, requires employers with 15 or more employees to provide reasonable accommodations for known limitations related to pregnancy, childbirth, or related medical conditions. It also prohibits employers from taking adverse action against workers for requesting or using those accommodations.

Some states go further than that 15-employee threshold. Iowa's Civil Rights Act, for example, covers employers with four or more employees, extending pregnancy discrimination protections well beyond the federal floor.

For a deeper look at PWFA requirements, read our guide: Pregnant Workers Fairness Act Employer Guide.

Where AI Tools Create Risk

Resume Screening and Hiring

AI resume screeners trained on historical hiring data can develop biases against candidates with employment gaps. Gaps caused by maternity leave or pregnancy-related medical leave look the same to the algorithm as any other gap. If the tool filters out candidates with gaps, the result is automated pregnancy discrimination in hiring, even if the employer never saw the filtered resumes.

Scheduling and Shift Assignment

Automated scheduling tools that optimize for "reliability" or "availability" can penalize employees who have requested schedule modifications as a pregnancy accommodation. If the tool deprioritizes those employees for desirable shifts, that is an adverse employment action.

Performance Reviews and Productivity Tracking

Performance algorithms that measure output over fixed periods without accounting for approved leave or accommodations will produce lower scores for employees who took pregnancy-related time off. If those scores feed into promotion, raise, or termination decisions, the tool has created a discriminatory outcome.
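To see why the fixed-period measurement matters, here is a minimal sketch in Python. The function names, numbers, and the idea of a "units per day" score are all invented for illustration; no vendor's actual formula is shown.

```python
# Hypothetical illustration: a fixed-window productivity score vs. one that
# excludes approved leave days. All names and numbers are invented.

def raw_score(units_produced: int, period_days: int) -> float:
    """Naive score: output divided by every day in the review period."""
    return units_produced / period_days

def leave_adjusted_score(units_produced: int, period_days: int,
                         approved_leave_days: int) -> float:
    """Score based only on days the employee was actually scheduled to work."""
    worked_days = period_days - approved_leave_days
    if worked_days <= 0:
        raise ValueError("no worked days in the review period")
    return units_produced / worked_days

# An employee who produced 400 units over a 100-day review period,
# 20 days of which were approved pregnancy-related leave:
print(raw_score(400, 100))                 # 4.0 units/day -- looks "low"
print(leave_adjusted_score(400, 100, 20))  # 5.0 units/day -- actual rate
```

The raw score understates the employee's real output rate by a fifth; fed into a promotion or termination decision, that gap is the discriminatory outcome described above.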

Attendance and Leave Management

Automated attendance systems that assign "points" for absences or late arrivals can accumulate penalties for employees attending prenatal care appointments or experiencing pregnancy-related health issues. A points-based system that assigns one point per absence and triggers a written warning at six points will likely reach that threshold for an employee attending biweekly prenatal appointments within three months. If the system triggers disciplinary action at that threshold, it can push a pregnant employee toward termination without any human review of the underlying reasons.
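The arithmetic above can be made concrete with a short sketch. This is not any vendor's system; the point structure, field names, and threshold are taken from the hypothetical in the paragraph.

```python
# Hypothetical points-based attendance tracker: one point per absence,
# written warning triggered at six points. Names are illustrative only.

WARNING_THRESHOLD = 6

def attendance_points(absences: list[dict], exclude_protected: bool) -> int:
    """Sum one point per absence, optionally skipping protected absences."""
    return sum(
        1 for a in absences
        if not (exclude_protected and a["protected"])
    )

# Biweekly prenatal appointments over roughly three months: six absences,
# every one of them a protected pregnancy accommodation.
prenatal_visits = [{"reason": "prenatal appointment", "protected": True}
                   for _ in range(6)]

naive = attendance_points(prenatal_visits, exclude_protected=False)
safe = attendance_points(prenatal_visits, exclude_protected=True)

print(naive >= WARNING_THRESHOLD)  # True  -- warning fires automatically
print(safe >= WARNING_THRESHOLD)   # False -- protected leave excluded
```

The only difference between the two outcomes is whether the system knows an absence is protected, which is exactly the configuration step most off-the-shelf tools leave to the employer.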

What Employers Should Do

You do not need to stop using AI tools in HR. But you do need to understand what they are doing and build safeguards around them.

  • Audit your tools. Know what data your AI systems use, what factors they weight, and how they handle employees on leave or with accommodations. If the vendor cannot explain this clearly, that is a red flag.
  • Build in human review. Automated recommendations should be reviewed by a person before they result in adverse actions like termination, demotion, or denial of a promotion. This is especially important for any decision affecting an employee who has requested a pregnancy-related accommodation.
  • Exclude protected leave from negative metrics. Configure your tools to exclude approved leave, FMLA time, and pregnancy accommodations from attendance scores, performance calculations, and scheduling algorithms.
  • Document your process. Keep records showing how your AI tools work, what safeguards you have in place, and how accommodation requests are handled within those systems.
  • Train your managers. The people who act on AI-generated recommendations need to understand that an algorithm's output is not a final decision. They need to know when to pause and check whether a flagged employee is on protected leave or has an active accommodation.
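One way to implement the human-review safeguard is to route every automated recommendation through a gate before it becomes an action. The sketch below is illustrative only; the action categories, field names, and routing labels are assumptions, not a turnkey compliance tool.

```python
# Hypothetical review gate between an AI tool's recommendation and any
# adverse action. Categories and field names are illustrative only.

ADVERSE_ACTIONS = {"termination", "demotion", "written_warning",
                   "promotion_denial"}

def route_recommendation(action: str, employee: dict) -> str:
    """Route an automated recommendation: 'execute', 'human_review',
    or 'hr_legal_review'. Adverse actions never execute directly."""
    if action not in ADVERSE_ACTIONS:
        return "execute"
    # Accommodation and protected-leave cases get escalated review.
    if employee.get("active_accommodation") or employee.get("protected_leave"):
        return "hr_legal_review"
    return "human_review"

print(route_recommendation("shift_assignment", {}))          # execute
print(route_recommendation("written_warning",
                           {"active_accommodation": True}))  # hr_legal_review
```

The design choice worth noting: the gate keys off the action's severity and the employee's accommodation status, not the tool's confidence score, so a "high-confidence" algorithmic flag still cannot terminate anyone on its own.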

Create an AI Use Policy for HR

If your business uses AI tools in any part of the employment process, you should have a written policy that covers how those tools are selected, what decisions they inform, how accommodations and leave are handled within them, and who reviews automated recommendations before they become final actions.

This policy does not need to be long. It needs to be specific enough that your managers know what to check and your documentation shows you took the issue seriously. If a discrimination claim arises, having a clear, written policy and evidence that you followed it makes a significant difference. Between the federal 15-employee threshold and lower state thresholds like Iowa's, these protections reach employers of nearly every size.

Surge Business Law helps employers draft AI use policies for HR functions, review existing tools for compliance risks, and build accommodation workflows that work alongside automated systems. Our Momentum Membership gives you unlimited email access to our business attorneys for $95/month, so you can ask these questions as they come up.

The Bottom Line for Employers

AI tools can make HR processes more efficient. They can also generate liability if they are not configured to account for pregnancy accommodations, protected leave, and the legal requirements that apply to your business. The employer is always responsible for the outcome, regardless of whether a person or a machine made the decision.

Review your tools. Build in safeguards. Get a policy in place before a problem surfaces.

Schedule a free consultation to discuss your AI tools and employment policies, or learn more about how our business law services support employers navigating these issues.