Grace Howard

Grace Howard is an associate in the firm’s Washington, DC office. She represents and advises clients on a range of cybersecurity, data privacy, and government contracts issues, including cyber and data security incident response and preparedness, regulatory compliance, and internal investigations, including matters involving allegations of noncompliance with U.S. government cybersecurity regulations and fraud under the False Claims Act.

Prior to joining the firm, Grace served in the United States Navy as a Surface Warfare Officer. She currently serves in the U.S. Navy Reserve.

This update highlights key mid-year legislative and regulatory developments and builds on our first quarter update related to artificial intelligence (“AI”), connected and automated vehicles (“CAVs”), Internet of Things (“IoT”), and cryptocurrencies and blockchain developments.

I. Federal AI Legislative Developments

    In the first session of the 119th Congress, lawmakers rejected a proposed moratorium on state and local enforcement of AI laws and advanced several AI legislative proposals focused on deepfake-related harms.  Specifically, on July 1, after weeks of negotiations, the Senate voted 99-1 to strike a proposed 10-year moratorium on state and local enforcement of AI laws from the budget reconciliation package, the One Big Beautiful Bill Act (H.R. 1), which President Trump signed into law.  The vote to strike the moratorium follows the collapse of an agreement on revised language that would have shortened the moratorium to 5 years and allowed states to enforce “generally applicable laws,” including child online safety, digital replica, and CSAM laws, that do not have an “undue or disproportionate effect” on AI.  Congress could technically still consider the moratorium during this session, but the chances of that happening are low based on both the political atmosphere and the lack of a must-pass legislative vehicle in which it could be included.  See our blog post on this topic for more information.

    Additionally, lawmakers continue to focus legislation on deepfakes and intimate imagery.  For example, on May 19, President Trump signed the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (“TAKE IT DOWN”) Act (H.R. 633 / S. 146) into law, which requires online platforms to establish a notice and takedown process for nonconsensual intimate visual depictions, including certain depictions created using AI.  See our blog post on this topic for more information.  Meanwhile, members of Congress continued to pursue additional legislation to address deepfake-related harms, such as the STOP CSAM Act of 2025 (S. 1829 / H.R. 3921) and the Disrupt Explicit Forged Images And Non-Consensual Edits (“DEFIANCE”) Act (H.R. 3562 / S. 1837).