March in State, Federal, and International Cybersecurity and AI Law


Welcome to this month’s issue of The BR Privacy, Security & AI Download, the digital newsletter of Blank Rome’s Privacy, Security & Data Protection practice. 

California AI Rules for Lawyers and Arbitrators Pass Senate, Head to Assembly 

Blank Rome vice chair of artificial intelligence Sharon R. Klein and partners Mark S. Adams, Joseph J. Mellema, and Alex C. Nisenbaum authored this alert discussing the California Senate‑approved SB 574, a bill that would establish new AI guardrails for attorneys and arbitrators.

State & Local Laws & Regulations

CalPrivacy Sponsors Whistleblower Protection Bill: Assemblymember Pilar Schiavo introduced AB 2021, sponsored by the California Privacy Protection Agency (“CalPrivacy”), to establish whistleblower protections under the California Consumer Privacy Act (“CCPA”). The legislation, known as the Whistleblower Protection and Privacy Act, supports whistleblowers through an award program to incentivize individuals to report potential violations and anti-retaliation provisions to protect them after coming forward. The bill addresses a key challenge in privacy enforcement: technology and data-driven companies often have complex business practices intentionally hidden from public view, making it time-consuming for regulators to uncover potential violations. Under AB 2021, whistleblowers may share in a portion of enforcement awards, and at the agency’s discretion, their attorneys may collaborate on the enforcement action. The legislation also prohibits employer retaliation against individuals who report violations.

Bill to Expand Data Deletion Rights Introduced in California Senate: California Senate Bill 923, introduced by Senator Josh Becker, proposes significant amendments to the CCPA that would expand consumer data deletion rights. The bill seeks to address a gap in current privacy protections by requiring businesses to delete not only personal information collected directly from consumers, but also data obtained from data brokers and other third parties. This includes demographic data and purchasing histories often used for targeted advertising, pricing decisions, and profiling. The bill also enhances accessibility requirements for privacy requests. Businesses operating exclusively online would be required to provide consumers with both an e-mail address and an online method, such as a web form or portal, to submit access, correction, and deletion requests. 

California Senate Passes Bill to Regulate Attorney Use of AI: California Senate Bill 574, authored by Senator Tom Umberg, passed the Senate unanimously and has been sent to the Assembly for consideration. The bill establishes artificial intelligence (“AI”) guardrails for lawyers and arbitrators to address risks associated with generative AI in legal practice, including confidentiality breaches and AI “hallucinations” that produce inaccurate citations or fabricated facts. The bill creates an explicit attorney duty requiring lawyers to prevent confidential or nonpublic information from being entered into public AI systems and to take reasonable steps to verify AI-generated content for accuracy, bias, and harmful material. The bill also makes citation verification nondelegable, requiring attorneys to personally read and verify all citations in court filings regardless of whether AI assisted in drafting. Finally, the bill prohibits arbitrators from delegating any decision-making to AI and directs them to avoid AI use that could influence procedural or substantive outcomes. If enacted, California would become one of the first states to statutorily regulate AI use by legal professionals.

Connecticut AG Releases 2025 Connecticut Data Privacy Act Enforcement Report: Connecticut Attorney General (“AG”) William Tong released the third annual report on the Connecticut Data Privacy Act (“CTDPA”), marking the first report to reflect enforcement of expanded minors’ privacy protections that became effective on October 1, 2024. The report discloses active investigations into online platforms affecting children and teens, including messaging apps, gaming platforms, and AI chatbots. Key enforcement activities in 2025 included investigations into connected vehicles and geolocation data, social media platforms used by minors, gaming apps potentially using children’s data for targeted advertising, and data brokers. The AG’s Office issued dozens of violation notices, finalized multiple data breach settlements, and resolved its first CTDPA enforcement action. The report recommends legislative action to narrow public information exemptions, adopt genetic data privacy protections, enact AI and chatbot safeguards, and expand universal opt-out provisions. 

Maine House of Representatives Advances Data Privacy Law: The Maine House of Representatives advanced the Maine Online Data Privacy Act to the Senate chamber. The current legislative proposal aligns closely with Maryland’s comprehensive data privacy law, which contains stringent data minimization requirements, enhanced children’s privacy protections, and prohibitions on sensitive data sales.

Virginia AG Announces Enforcement of Social Media Time Limits for Minors: Virginia Attorney General Jay Jones announced that his office intends to fully enforce new provisions of the Virginia Consumer Data Protection Act requiring social media platforms to limit minors’ daily usage. Effective on January 1, 2026, the Virginia law requires social media platforms to employ commercially reasonable methods, such as neutral age screen mechanisms, to determine whether a user is a minor under 16 years of age and to limit minors’ use of social media platforms to one hour per day, per service or application. Verifiable parental consent is required to increase or decrease the daily time limit. The announcement follows the Attorney General’s filing of a motion to dismiss a lawsuit brought by NetChoice, a trade association representing social media companies, seeking to block enforcement of Virginia’s law. Under the enforcement framework, the Attorney General’s office will communicate evidence of non-compliance directly to companies and provide 30 days to cure violations as required by law. Companies that fail to remedy violations face enforcement actions that could result in civil penalties of up to $7,500 per violation, as well as injunctive relief. 

Oklahoma Legislature Advances Comprehensive Privacy Law: After seven years of legislative efforts, Oklahoma is poised to join the growing network of states with comprehensive consumer privacy legislation. The Oklahoma House approved Senate Bill 546 (the “bill” or “SB 546”) on an 84-4 vote. The bill now requires Senate concurrence, with sponsors expressing confidence that Governor Kevin Stitt will sign it into law. The bill’s framework aligns closely with Virginia’s Consumer Data Protection Act, which has become the template for a majority of the 19 enacted state comprehensive privacy laws. SB 546 applies to businesses that control or process the personal data of at least 100,000 Oklahoma residents, or the data of at least 25,000 consumers while deriving 50 percent or more of gross revenue from data sales. The legislation provides standard data subject access rights, including opt-outs for targeted advertising and data sales, and requires data protection assessments for certain processing activities. If enacted, the law will take effect on January 1, 2027, with enforcement exclusively through the attorney general and a permanent 30-day cure provision. Notably, the bill omits provisions found in other recent state laws, such as universal opt-out mechanism recognition and enhanced children’s privacy protections. 

Federal Laws & Regulations

FTC Issues Policy Statement on COPPA Enforcement to Promote Age-Verification Technology: The Federal Trade Commission (“FTC”) issued a policy statement announcing that it will not bring enforcement actions under the Children’s Online Privacy Protection Rule (“COPPA Rule”) against operators of general audience and mixed-audience websites or online services that collect, use, or disclose personal information of children for the sole purpose of determining a user’s age via age-verification technologies, even without first obtaining verifiable parental consent. The policy reflects the FTC’s recognition that age verification plays a critical role in helping parents monitor their children’s online activities. To benefit from this enforcement discretion, operators must: use collected information solely for age verification purposes; retain data only as long as necessary and delete it promptly thereafter; disclose data only to third parties capable of maintaining its confidentiality, security, and integrity; provide clear notice to parents and children; employ reasonable security safeguards; and take reasonable steps to ensure their verification methods provide reasonably accurate results. The FTC also indicated it intends to initiate a formal review of the COPPA Rule to address age-verification mechanisms, with the policy statement remaining effective until final rule amendments are published or the statement is otherwise withdrawn.

NIST Launches AI Agent Standards Initiative: The Center for AI Standards and Innovation at the National Institute of Standards and Technology (“NIST”) announced the launch of the AI Agent Standards Initiative (the “Initiative”), a new federal effort to promote the secure and interoperable development of autonomous AI agents. AI agents are capable of performing tasks such as writing and debugging code, managing e-mails and calendars, and making purchases on behalf of users. NIST identified that without reliable industry standards and interoperability protocols, this emerging technology risks ecosystem fragmentation and constrained adoption. The Initiative will advance along three pillars: (1) facilitating industry-led development of agent standards and United States leadership in international standards bodies; (2) fostering community-led open source protocol development and maintenance for agents; and (3) advancing research in AI agent security and identity to enable new use cases and promote trusted adoption across economic sectors. 

State AGs Urge Congress to Pass Senate Version of Kids Online Safety Act: Forty state attorneys general sent a letter to Congressional leaders urging passage of the Senate version of the Kids Online Safety Act (“KOSA”), while expressing opposition to the House’s counterpart bill. The letter states that social media platforms deliberately target minors and are intentionally designed to be addictive, generating substantial profits by monetizing minors’ personal data through targeted advertising and failing to adequately disclose harms associated with excessive use. The state attorneys general raise two primary objections to the House version. First, they assert that the House bill’s expansive preemption language would limit states’ ability to address evolving online harms, undermining pioneering state laws promoting online safety for minors. Second, they criticize the House bill’s omission of a “duty of care” requirement, a key component of the Senate bill that would impose a legal obligation on platforms to prevent harm, in favor of merely requiring “reasonable policies, practices, and procedures” addressing a limited list of harms. According to the state attorneys general, many companies already maintain such policies, which have proven ineffective at protecting minors. 

U.S. Litigation

Fifth Circuit Holds Oral Consent Sufficient for Prerecorded Telemarketing Calls: The Fifth Circuit held in Bradford v. Sovereign Pest Control of TX, Inc., that oral consent is sufficient to satisfy the “prior express consent” requirement under the Telephone Consumer Protection Act (“TCPA”) for prerecorded calls, including telemarketing calls. The ruling overturns more than a decade of FCC precedent requiring prior express written consent for telemarketing robocalls. The case arose when plaintiff Radley Bradford sued Sovereign Pest Control, alleging the company’s automated calls reminding him to schedule inspections and renew his service plan violated the TCPA because he never provided written consent. The Fifth Circuit disagreed, finding that Bradford had provided valid consent when he voluntarily gave his cell phone number during the service agreement, never limited the scope of calls, and never objected to receiving them. Interpreting the TCPA’s text without deference to FCC regulations, consistent with the Supreme Court’s decisions in McLaughlin Chiropractic Associates, Inc. v. McKesson Corp. and Loper Bright Enterprises v. Raimondo, the Fifth Circuit concluded that “express consent” at the time Congress enacted the TCPA encompassed consent given either orally or in writing. For a detailed analysis of this decision and its implications for businesses, see our Client Alert on this case.

Job Applicants File Lawsuit Alleging AI Screening Tool Violates Consumer Reporting Laws: Two job applicants filed suit against Eightfold AI (“Eightfold”) alleging that the company’s AI-powered talent evaluation platform violates the federal Fair Credit Reporting Act (“FCRA”) and California’s Investigative Consumer Reporting Agencies Act by functioning as a consumer reporting agency without complying with the disclosure, authorization, and notice requirements these laws impose. According to the complaint, Eightfold’s platform allegedly gathers information about job applicants from third-party sources, including LinkedIn, GitHub, and Stack Overflow; analyzes data from more than 1.5 billion global data points; creates inferences about applicants’ characteristics, behavior, and abilities; and ranks candidates on a scale predicting their likelihood of success. The plaintiffs claim they were never informed that a consumer report would be created, never authorized its creation, and never had an opportunity to review or dispute the information. The lawsuit signals a new litigation theory distinct from prior discrimination-focused challenges to AI hiring tools. If successful, the case could require AI screening vendors and employers using such tools to implement full FCRA compliance programs, including standalone disclosures, written authorizations, and pre-adverse action notices.

Court Holds Cyber Policy May Extend to Losses Incurred as a Result of Airline Computer Outage: The U.S. District Court for the Northern District of Texas issued a ruling in Southwest Airlines Co. v. Liberty Insurance Underwriters Inc., a case arising from a 2016 systemwide computer failure that disrupted flights and affected approximately 475,000 customers. At issue was Southwest’s cyber risk insurance policy, which provided “System Failure Coverage” obligating the insurer to pay “all loss” incurred “solely as a result of a System Failure.” The policy defined recoverable loss to include costs that “would not have been incurred but for a Material Interruption.” Southwest claimed over $77 million in losses and sought $10 million in coverage from Liberty, the fifth-layer excess insurer. The Court granted Southwest’s motion for partial summary judgment, concluding that the phrase “but for” in the policy’s loss definition means “except for” or “if it were not for,” consistent with standard but-for causation principles under Texas law. According to the filing, the policy defines loss to include “costs that would not have been incurred but for a material interruption,” and “costs to reroute or reschedule passenger travel that would not have been incurred but for a material interruption.” The Court stated that its finding does not “conclude or opine with respect to whether the insurance policy provides coverage for the disputed expenses, or any part of them.” The case proceeds with remaining coverage and damages questions.

Lawsuit Accuses Computer Manufacturer of Sharing Data with Chinese Parent Company in Violation of DOJ Bulk Data Transfer Rule: A proposed class action filed in California federal court accuses Lenovo (United States) Inc. of unlawfully sharing American consumers’ personal data with its Chinese parent, Lenovo Group, in violation of the Department of Justice’s Bulk Data Transfer Rule. The complaint alleges that Lenovo’s website deploys approximately 55 tracking technologies, including web beacons, pixels, cookies, APIs, and software development kits from third parties such as TikTok, Facebook, Google, and Microsoft, to collect information from over 100,000 U.S. consumers. According to the suit, the data collected in bulk includes IP addresses, cookie data, and URLs revealing browsing behavior, which can be combined with other datasets to create comprehensive profiles reflecting consumers’ behavior, preferences, and demographics. The plaintiff asserts that Lenovo does not provide users with the opportunity to consent to or opt out of these tracking technologies. The complaint warns that Lenovo’s practices “create undue risks to national security and to the privacy of U.S. persons” by enabling the development of detailed dossiers that could be used to target individuals in sensitive roles. 

U.S. Enforcement

California Attorney General Issues Largest CCPA Fine Ever: The Office of the California Attorney General announced it had entered into a settlement with the Walt Disney Company (“Disney”) to resolve allegations that Disney failed to fully effectuate consumers’ requests to opt out of the sale or sharing of their personal data across all devices and streaming services associated with their accounts. The enforcement action stems from a January 2024 investigative sweep of streaming services for potential CCPA violations. According to the California Attorney General, the investigation revealed significant gaps in Disney’s opt-out mechanisms. Specifically, opt-out toggles applied only to the specific streaming service and device being used, rather than all services and devices linked to a consumer’s account. Webform opt-outs stopped sharing only through Disney’s own advertising platform while allowing continued data sharing with embedded third-party ad-tech companies. Additionally, Disney honored Global Privacy Control (“GPC”) signals only for the specific device used to make the request, even when consumers were logged into their accounts. AG Bonta emphasized that businesses cannot require consumers to submit separate opt-out requests device-by-device or service-by-service. Under the settlement, Disney must implement opt-out methods that fully stop the sale or sharing of consumers’ personal information.

CalPrivacy Finalizes $1.1 Million Settlement with Youth Sports Media Company: The California Privacy Protection Agency announced a final settlement with 2080 Media, Inc., d/b/a PlayOn Sports (“PlayOn”), a youth sports media company, resolving alleged violations of the California Consumer Privacy Act. PlayOn operates GoFan, MaxPreps, and the NFHS Network, providing digital ticketing, streaming, and related services to high schools and youth sports organizations nationwide, including approximately 1,400 California schools. CalPrivacy’s investigation found that PlayOn collected personal information through first- and third-party cookies, pixel tags, and similar tracking technologies for targeted advertising, which constituted the sale and sharing of personal information under the CCPA. However, PlayOn failed to provide consumers with an effective method to opt out of such sale and sharing and did not honor opt-out preference signals. CalPrivacy also cited deficient privacy notices, including a privacy policy that had not been updated for over 18 months and failed to inform consumers of their right to opt out of sharing. Under the settlement, PlayOn must pay a $1.1 million administrative fine, conduct privacy risk assessments, maintain contracts with third parties receiving personal information, properly honor opt-out preference signals, and ensure notices are understandable to its audience, including minors attending high school events. This enforcement action underscores CalPrivacy’s continued focus on tracking technologies and the importance of providing meaningful opt-out mechanisms for consumers.

Florida AG Establishes Unit Focused on Foreign Adversaries’ Threats to Privacy and Economic Security: Florida Attorney General James Uthmeier has established the Consumer Harm from International Nefarious Actors (“CHINA”) Prevention Unit, a dedicated section within the Office of the Attorney General focused on combating threats from the Chinese Communist Party and other foreign adversaries to Florida consumers, data privacy, and economic security. The unit’s core focus is prevention through investigations into companies with ties to foreign adversaries, particularly those involving data practices. The Attorney General has already issued subpoenas to several companies over alleged data privacy and security concerns. This initiative represents a significant state-level enforcement effort targeting foreign companies’ data collection and handling practices and signals heightened scrutiny of supply chain and data security risks associated with Chinese-manufactured consumer and medical devices operating in Florida.

FTC Reminds Data Brokers of Their Obligations to Comply with PADFAA: The FTC sent warning letters to 13 data brokers emphasizing their compliance obligations under the Protecting Americans’ Data from Foreign Adversaries Act of 2024 (“PADFAA”). PADFAA, which took effect on June 24, 2024, prohibits data brokers from selling, licensing, renting, trading, transferring, releasing, disclosing, providing access to, or otherwise making available personally identifiable sensitive data of U.S. individuals to any foreign adversary country, currently including China, Iran, North Korea, and Russia, or any entity controlled by a foreign adversary. The FTC has enforcement authority under Section 5 of the FTC Act and treats violations of PADFAA as unfair or deceptive practices. The term “sensitive data” under PADFAA encompasses government-issued identifiers (such as Social Security numbers), health data, financial account details, biometric information, genetic information, precise geolocation, private communications, account or device log-in credentials, information about individuals under age 17, information revealing Armed Forces member status, and various other demographic and behavioral data. The FTC’s letters specifically noted that the agency identified instances where some recipients have offered solutions and insights involving the status of individuals as members of the Armed Forces. The letters warn that violations may result in FTC enforcement actions with civil penalties of up to $53,088 per violation. 

Texas AG Sues Social Media Company for Deceptive Practices and Child Safety Violations: Texas Attorney General Ken Paxton announced his office filed a lawsuit against Snap, Inc. (“Snapchat”) alleging the platform deceived parents and consumers about the safety of its application and exposed children to harmful content and addictive features. The lawsuit claims Snapchat knowingly misrepresented its platform as safe for children by promoting “12+” age ratings on app stores while simultaneously exposing users to dangerous and mature content, including profanity, sexual content, nudity, and drug use. Central to the complaint are allegations that app features such as “Snapstreaks” incentivize daily use and harm young users due to their addictive nature. AG Paxton emphasized that parents have a fundamental right to understand the dangers of applications their children use and should not be misled by technology companies. This action is part of AG Paxton’s ongoing enforcement efforts targeting major technology and social media companies over child safety concerns, following similar lawsuits filed against TikTok and Roblox. The case aligns with a broader national trend of state attorneys general pursuing legal action against platforms for allegedly designing features that purposefully addict youth and for failing to adequately protect minor users. 

Los Angeles County Sues Gaming Platform over Child Safety Failures: Los Angeles County filed a civil lawsuit against Roblox Corporation alleging the company falsely marketed its online gaming platform as safe for children while knowingly allowing it to become a “hunting ground for predators.” The complaint alleges that Roblox’s default settings, including unverified accounts, no age verification, and open communication channels, repeatedly exposed minors to sexual exploitation. Until 2024, the platform allegedly allowed anyone to message or send friend requests to anyone else without requiring accounts to be tied to a phone number, e-mail address, or government ID. The county also alleges that Roblox hosted sexually explicit content and that thousands of users shared child sexual abuse material on the platform. Los Angeles seeks damages under California’s False Advertising Law and Unfair Competition Law, civil penalties of up to $2,500 per violation, and a court order requiring Roblox to implement “meaningful safeguards that materially reduce foreseeable harm to minors.” Roblox disputes the claims and maintains that “safety is a constant and consistent focus” of its work. 

Texas AG Settles ACR Data Collection Lawsuit with TV Manufacturer: Samsung Electronics America Inc. (“Samsung”) agreed to resolve allegations brought by the Texas Attorney General that the company violated the Texas Deceptive Trade Practices Act by failing to adequately disclose the automatic content recognition (“ACR”) technology embedded in its Smart TVs. Paxton alleged that Samsung’s ACR technology captured audio and visual data of what viewers watched every 500 milliseconds without consumers’ knowledge or meaningful consent. The settlement requires Samsung to halt collection or processing of ACR viewing data without first obtaining Texas consumers’ express consent and to implement “clear and conspicuous” disclosures and consent screens enabling consumers to make informed decisions about data collection. The settlement follows a brief period in January 2026 during which a Texas state court issued, then promptly vacated, a temporary restraining order blocking Samsung’s ACR data collection. The Samsung lawsuit is one of five similar actions Paxton filed against major smart TV manufacturers, including Sony, LG, Hisense, and TCL, alleging unlawful consumer surveillance through ACR technology. Litigation against the remaining manufacturers continues.

International Laws & Regulations

European Commission Misses AI Act Deadline for High-Risk AI Guidance: The European Commission missed the February 2, 2026, deadline to publish critical guidelines on high-risk AI systems under the EU AI Act, marking the second delay for this anticipated guidance. The guidelines are intended to clarify which AI systems qualify as “high-risk” and therefore face more stringent compliance obligations, including documentation requirements and post-market monitoring plans. A Commission spokesperson stated the guidance is “currently subject to a revised timeline” with publication expected “at a later date” to incorporate stakeholder feedback and respect internal procedures. This delay compounds broader implementation challenges facing the AI Act, including missing technical standards, delayed codes of practice for general-purpose AI, and member states’ failure to formally designate national enforcement bodies. Two EU standardization bodies, the European Committee for Electrotechnical Standardization and the European Committee for Standardization, missed their fall 2025 deadline to produce necessary technical standards and are now targeting late 2026. The Commission’s Digital Omnibus package proposes delaying high-risk AI rules by over a year beyond the current effective date of August 2026, giving companies additional compliance time. However, critics warn this creates uncertainty and undermines confidence in the regulatory framework.

EDPB, EDPS Issue Joint Opinion on Digital Omnibus Proposal: The European Data Protection Board (“EDPB”) and European Data Protection Supervisor (“EDPS”) issued a joint opinion on the European Commission’s Digital Omnibus proposal to amend the General Data Protection Regulation (“GDPR”), ePrivacy Directive, and other EU digital legislation. While the EDPB and EDPS support the proposal’s objectives to simplify compliance, reduce administrative burden, and foster EU competitiveness, they strongly urge co-legislators to reject the proposed changes to the definition of “personal data” under the GDPR, arguing such changes would narrow the concept of personal data and adversely affect fundamental data protection rights. The EDPB and EDPS expressed concern that the Commission’s proposed amendment, which would specify when information is not personal data for a given entity, goes beyond a technical amendment and selectively codifies CJEU jurisprudence in a manner that increases legal uncertainty. They particularly objected to delegating authority to the Commission to determine through implementing acts what is no longer personal data after pseudonymization. The EDPB and EDPS welcomed proposals to extend data breach notification deadlines from 72 to 96 hours, to address cookie consent fatigue through automated machine-readable consent signals, and to create common templates for data breach notifications and data protection impact assessments. A leaked draft of compromise text circulated in the EU Council appears to eliminate the Commission’s revised “personal data” definition entirely. 

U.S. Enters into Trade Agreements with Argentina and Bangladesh that Include Free Cross-Border Data Transfers: The United States announced that it had signed an agreement with Argentina on Reciprocal Trade and Investment. Among other things, the trade agreement with Argentina will allow Argentina to deem the United States adequate for cross-border data transfers and includes a commitment to prevent the restriction of U.S. technology innovation. Separately, the United States announced it had signed an agreement with Bangladesh on Reciprocal Trade pursuant to which the countries agreed to “permit the free transfer of data across trusted borders.”

CJEU Clarifies When Personal Data Is Collected from the Data Subject: The Court of Justice of the European Union (“CJEU”) issued a ruling clarifying that personal data obtained through direct observation or monitoring of an individual’s activity is considered “collected from the data subject” under Article 13 of the GDPR. The case arose from a Swedish public transportation authority’s use of body-worn cameras by ticket inspectors, which Sweden’s data protection authority found violated transparency requirements. The CJEU’s ruling has important practical implications for data controllers employing surveillance technologies such as cameras, drones, smart devices, and wearables. These controllers must provide data subjects with the transparency disclosures required under GDPR Article 13 rather than Article 14, which applies when data is obtained from other sources. The CJEU confirmed that layered notice approaches, such as information signage with additional details available on a website, satisfy transparency requirements in these contexts.

Recent Publications & Media Coverage

Blank Rome partners Harrison Brown, Jeffrey N. Rosenthal, and of counsel Thomas P. Cialino authored this alert discussing the Fifth Circuit’s consideration of the appeal in Bradford v. Sovereign Pest Control of TX, Inc., and how this may usher in rapid change for Telephone Consumer Protection Act litigation.
