
Nicholas Xenakis

Nick Xenakis draws on his Capitol Hill experience to provide regulatory and legislative advice to clients in a range of industries, including technology. He has particular expertise in matters involving the Judiciary Committees, such as intellectual property, antitrust, national security, immigration, and criminal justice.

Nick joined the firm’s Public Policy practice after serving most recently as Chief Counsel for Senator Dianne Feinstein (D-CA) and Staff Director of the Senate Judiciary Committee’s Human Rights and the Law Subcommittee, where he was responsible for managing the subcommittee and Senator Feinstein’s Judiciary staff. He also advised the Senator on all nominations, legislation, and oversight matters before the committee.

Previously, Nick was the General Counsel for the Senate Judiciary Committee, where he managed committee staff and directed legislative and policy efforts on all issues in the Committee’s jurisdiction. He also participated in key judicial and Cabinet confirmations, including those of an Attorney General and two Supreme Court Justices. In addition, Nick was responsible for managing a broad range of committee equities in larger legislation, including appropriations, COVID-relief packages, and the National Defense Authorization Act.

Before his time on Capitol Hill, Nick served as an attorney with the Federal Public Defender’s Office for the Eastern District of Virginia. There he represented indigent clients charged with misdemeanor, felony, and capital offenses in federal court throughout all stages of litigation, including trial and appeal. He also coordinated district-wide habeas litigation following the Supreme Court’s decision in Johnson v. United States (invalidating the residual clause of the Armed Career Criminal Act).

The field of artificial intelligence (“AI”) is at a tipping point. Governments and industries are under increasing pressure to forecast and guide the evolution of a technology that promises to transform our economies and societies. In this series, our lawyers and advisors provide an overview of the policy approaches and regulatory frameworks for AI in jurisdictions around the world. Given the rapid pace of technological and policy developments in this area, the articles in this series should be viewed as snapshots in time, reflecting the current policy environment and priorities in each jurisdiction.

The following article examines the state of play in AI policy and regulation in the United States. The previous article in this series covered the European Union.

Future of AI Policy in the U.S.

U.S. policymakers are focused on artificial intelligence (AI) platforms as the technology explodes into the mainstream.  AI has emerged as an active policy space across Congress and the Biden Administration, as officials scramble to educate themselves on the technology while crafting legislation, rules, and other measures to balance U.S. innovation leadership with national security priorities.

Over the past year, AI issues have drawn bipartisan interest and support.  House and Senate committees have held nearly three dozen hearings on AI this year alone, and more than 30 AI-focused bills have been introduced so far this Congress.  Two bipartisan groups of Senators have announced separate frameworks for comprehensive AI legislation.  Several AI bills—largely focused on the federal government’s internal use of AI—have also been voted on and passed through committees. 

Meanwhile, the Biden Administration has announced plans to issue a comprehensive executive order this fall to address a range of AI risks under existing law.  The Administration has also taken steps to promote the responsible development and deployment of AI systems, including securing voluntary commitments regarding AI safety and transparency from 15 technology companies. 

Despite strong bipartisan interest in AI regulation, commitment from leaders of major technology companies engaged in AI R&D, and broad support from the general public, passing comprehensive AI legislation remains a challenge.  No consensus has emerged around either substance or process, with different groups of Members, particularly in the Senate, developing their own versions of AI legislation through different procedures.  In the House, a bipartisan bill would punt the issue of comprehensive regulation to the executive branch, creating a blue-ribbon commission to study the issue and make recommendations.

I. Major Policy & Regulatory Initiatives

Three versions of a comprehensive AI regulatory regime have emerged in Congress – two in the Senate and one in the House.  We preview these proposals below.

A. SAFE Innovation: Values-Based Framework and New Legislative Process

In June, Senate Majority Leader Chuck Schumer (D-NY) unveiled a new bipartisan proposal—with Senators Martin Heinrich (D-NM), Todd Young (R-IN), and Mike Rounds (R-SD)—to develop legislation to promote and regulate artificial intelligence.  Leader Schumer proposed a plan to boost U.S. global competitiveness in AI development, while ensuring appropriate protections for consumers and workers.

Continue Reading Spotlight Series on Global AI Policy — Part II: U.S. Legislative and Regulatory Developments

This quarterly update summarizes key legislative and regulatory developments in the second quarter of 2023 related to key technologies and related topics, including Artificial Intelligence (“AI”), the Internet of Things (“IoT”), connected and automated vehicles (“CAVs”), data privacy and cybersecurity, and online teen safety.

Artificial Intelligence

AI continued to be an area of significant interest to both lawmakers and regulators throughout the second quarter of 2023.  Members of Congress continue to grapple with ways to address risks posed by AI and have held hearings, made public statements, and introduced legislation to regulate AI.  Notably, Senator Chuck Schumer (D-NY) revealed his “SAFE Innovation framework” for AI legislation.  The framework reflects five principles for AI – security, accountability, foundations, explainability, and innovation – and is summarized here.  There were also a number of AI legislative proposals introduced this quarter.  Some proposals, like the National AI Commission Act (H.R. 4223) and the Digital Platform Commission Act (S. 1671), propose the creation of an agency or commission to review and regulate AI tools and systems.  Other proposals focus on mandating disclosures for AI systems.  For example, the AI Disclosure Act of 2023 (H.R. 3831) would require generative AI systems to include a specific disclaimer on any outputs generated, and the REAL Political Advertisements Act (S. 1596) would require political advertisements to include a statement within the contents of the advertisement if generative AI was used to generate any image or video footage.  Additionally, Congress convened hearings to explore AI regulation this quarter, including a Senate Judiciary Committee hearing in May titled “Oversight of A.I.: Rules for Artificial Intelligence.”

There also were several federal Executive Branch and regulatory developments focused on AI in the second quarter of 2023, including, for example:

  • White House:  The White House issued a number of updates on AI this quarter, including the Office of Science and Technology Policy’s strategic plan focused on federal AI research and development, discussed in greater detail here.  The White House also requested comments on the use of automated tools in the workplace, including a request for feedback on tools to surveil, monitor, evaluate, and manage workers, described here.
  • CFPB:  The Consumer Financial Protection Bureau (“CFPB”) issued a spotlight on the adoption and use of chatbots by financial institutions.
  • FTC:  The Federal Trade Commission (“FTC”) continued to issue guidance on AI, such as guidance expressing the FTC’s view that dark patterns extend to AI, that generative AI poses competition concerns, and that tools claiming to spot AI-generated content must make accurate disclosures of their abilities and limitations.
  • HHS Office of the National Coordinator for Health IT:  This quarter, the Department of Health and Human Services (“HHS”) released a proposed rule related to certified health IT that enables or interfaces with “predictive decision support interventions” (“DSIs”) that incorporate AI and machine learning technologies.  The proposed rule would require the disclosure of certain information about predictive DSIs to enable users to evaluate DSI quality and to determine whether and how to rely on DSI recommendations, including a description of the development and validation of the DSI.  Developers of certified health IT would also be required to implement risk management practices for predictive DSIs and make summary information about these practices publicly available.


Continue Reading U.S. Tech Legislative & Regulatory Update – Second Quarter 2023

On June 29, 2023, the Federal Trade Commission (“FTC”) published a blog post on its website expressing concerns about the recent rise of generative artificial intelligence (“generative AI”). To get ahead of this rapidly developing technology, the FTC identified “the essential building blocks” of generative AI and highlighted some business practices the agency would consider “unfair methods of competition.” The FTC also underscored technological aspects unique to generative AI that could raise competition concerns.

What is Generative AI?

Traditional AI has existed in the marketplace for years and has largely assisted users in analyzing or manipulating existing data.  Generative AI, on the other hand, represents a significant advance with its ability to generate entirely new text, images, audio, and video. The FTC notes that this content is frequently “indistinguishable from content crafted directly by humans.”

What are the “essential building blocks” of generative AI?

The FTC identified three “essential building blocks” that companies need to develop generative AI. The agency warns that, without fair access to these necessary inputs, competition and the ability of new players to enter the market will suffer.

  • Data. Generative AI models require access to vast amounts of data, particularly in the early phases where models build up a robust competency in a specific domain (for example, text or images). Market incumbents may possess an inherent advantage because of access to data collected over many years. The FTC notes that while “simply having large amounts of data is not unlawful,” creating undue barriers to access that data may be considered unfair competition.


Continue Reading The Federal Trade Commission and Generative AI Competition Concerns

The American Music Fairness Act (“AMFA”) has been re-introduced in the Senate for this Congress.  Sen. Padilla (D-CA) introduced the bill (S.253) earlier this month, along with Sens. Blackburn (R-TN), Tillis (R-NC), and Feinstein (D-CA).  The bill was referred to the Judiciary Committee, on which every cosponsor serves.  Further, Sen. Tillis serves as

On Tuesday, February 14, 2023, the Senate Judiciary Committee held a hearing titled “Protecting Our Children Online.”  The witnesses included only consumer advocates; no industry representatives testified.  Committee Chair Senator Durbin (D-IL) indicated, however, that he plans to hold another hearing featuring representatives from technology companies.

The key takeaway was that there continues to be strong bipartisan support for passing legislation that addresses privacy and online safety for minors.  Both Senator Durbin and Senator Graham (R-SC), the Committee’s Ranking Member, agreed that the Committee will mark up relevant legislation, which could happen within the next six months—making the next couple of months particularly important for negotiations.  Notably, all of the previously introduced legislation discussed at the hearing had at least passed out of its respective Senate committee last Congress.

Senators focused on four bills that could be included as part of a legislative package:

  1. Kids Online Safety Act (KOSA) (to be reintroduced).  KOSA would apply to “covered platforms,” which the previous bill defined as a “commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.”  Among other things, KOSA would impose a duty of care on covered platforms that would require them to “prevent and mitigate the heightened risks of physical, emotional, developmental, or material harms to minors posed by materials” on the platform.


Continue Reading Senate Judiciary Committee Holds Hearing on Children’s Online Safety

This quarterly update summarizes key legislative and regulatory developments in the fourth quarter of 2022 related to Artificial Intelligence (“AI”), the Internet of Things (“IoT”), connected and autonomous vehicles (“CAVs”), and data privacy and cybersecurity.

Artificial Intelligence

In the last quarter of 2022, the annual National Defense Authorization Act (“NDAA”), which contained AI-related provisions, was enacted into law.  The NDAA creates a pilot program to demonstrate use cases for AI in government. Specifically, the Director of the Office of Management and Budget (“Director of OMB”) must identify four new use cases for the application of AI-enabled systems to support modernization initiatives that require “linking multiple siloed internal and external data sources.” The pilot program is also meant to enable agencies to demonstrate the circumstances under which AI can be used to modernize agency operations and “leverage commercially available artificial intelligence technologies that (i) operate in secure cloud environments that can deploy rapidly without the need to replace operating systems; and (ii) do not require extensive staff or training to build.” Finally, the pilot program prioritizes use cases where AI can drive “agency productivity in predictive supply chain and logistics,” such as predictive food demand and optimized supply, predictive medical supplies and equipment demand, and predictive logistics for disaster recovery, preparedness, and response.

At the state level, in late 2022, there were also efforts to advance requirements for AI used to make certain types of decisions under comprehensive privacy frameworks.  The Colorado Privacy Act draft rules were updated to clarify the circumstances that require controllers to provide an opt-out right for the use of automated decision-making and the requirements for assessments of profiling decisions.  In California, although the California Consumer Privacy Act draft regulations do not yet cover automated decision-making, the California Privacy Protection Agency rules subcommittee provided a sample list of questions concerning automated decision-making during its December 16, 2022 board meeting.

Continue Reading U.S. AI, IoT, CAV, and Privacy Legislative Update – Fourth Quarter 2022

Today, the American Music Fairness Act (“AMFA”) will take a step forward as the bill is set for markup before the House Judiciary Committee.  The Copyright Act provides exclusive rights to publicly perform sound recordings by means of digital audio transmissions (e.g., internet and satellite), and AMFA is the latest attempt to extend such rights to analog audio transmissions (e.g., terrestrial radio).

Marking up the bill at this late stage of the Congressional term may mean that it is tacked onto an end-of-year spending package (as the CASE Act was in 2020) or, more likely, that it will be taken up again next Congress.  With bipartisan and bicameral support from members on the relevant Committees of jurisdiction, AMFA could still move in a divided Congress, making it all the more important for stakeholders to engage now if they want to support or make changes to the bill.

The AMFA Bill

The bipartisan AMFA bill was first introduced in the House on June 24, 2021 (H.R.4130), and its companion Senate bill followed on September 22, 2022 (S.4932).  Rep. Jerry Nadler (D-NY) recently became the House bill’s primary sponsor after its original sponsor, Rep. Ted Deutch (D-FL), left Congress.

The bill would amend Section 106(6) of the Copyright Act, which provides the exclusive right to publicly perform sound recordings via “digital audio transmission,” by deleting the word “digital.”  AMFA also attempts to address some criticisms leveled at similar predecessor bills.  For example, AMFA proposes low flat fees for certain nonsubscription broadcast transmissions by public or smaller commercial stations, while other fees would be set in rate-setting proceedings before the Copyright Royalty Board.  Such rate-setting proceedings would take into account economic, competitive, and programming information, including whether transmissions substitute for or promote record sales and whether they interfere with or enhance other revenue streams for sound recording owners.

Continue Reading Congress to Mark Up the American Music Fairness Act

Public Policy

With Senate Democrats having secured the 50th vote needed to maintain control of the Senate, both parties are eagerly awaiting the results of the Georgia runoff on December 6 between Senator Raphael Warnock (D-GA) and Republican challenger Herschel Walker.  If Walker wins, the Senate will be split 50-50.  The implications of a 51–49 Democratic majority versus a 50–50 Democratic majority are significant.

An Equally Divided Senate

Since February 3, 2021, the Senate has operated under an organizing resolution negotiated by Majority Leader Chuck Schumer (D-NY) and Minority Leader Mitch McConnell (R-KY).  The organizing resolution formalized a power-sharing agreement for the 117th Congress and was largely modeled on the 2001 power-sharing agreement reached by then-Democratic leader Tom Daschle (D-SD) and then-Republican leader Trent Lott (R-MS) following the November 2000 elections that resulted in a 50–50 Senate split for the 107th Congress.  The 2021 power-sharing agreement laid out internal rules of the Senate, apportioned the makeup and control of committees, and prescribed procedures for the control of Senate business.  Specifically, the 2021 power-sharing agreement provides that:

  • Senate committees be equally balanced with members of both parties;
  • The majority and minority on each committee have equal budgets and office space;
  • If a subcommittee vote is tied on either legislation or a nomination, the committee chair may discharge the matter and place it on the full committee’s agenda;
  • If a committee vote is tied, the Majority or Minority Leader may offer a motion to discharge the measure from committee, subject to a vote by the full Senate;
  • Debate may not be cut off for the first 12 hours; and
  • It is the “sense of the Senate” that both Majority and Minority leaders “shall seek to attain an equal balance of the interests of the two parties” when scheduling and debating legislative and executive business.


Continue Reading Governing the Senate in the 118th Congress

Immediate Reaction

With Republicans holding only a slim majority in the House and Democrats keeping their majority in the Senate, there is almost universal agreement that President Biden and the Democratic Party as a whole have outperformed expectations.  The President and the White House surely view these results as validation of his approach, his agenda, and his work so far.  A key part of that approach, at the core of his unity agenda and reiterated in his speech following the election, is his long-standing commitment to reaching across the aisle.  We can therefore expect the Administration to continue to seek out opportunities to work with Republicans, particularly in areas that garner bipartisan attention such as technology, children, and veterans.  We can also expect judicial nominations to remain a priority, both in the lame duck and in the next Congress, and for the President to continue advancing his agenda by taking Executive action when legally able.

Meanwhile, agencies will continue their work implementing key laws passed by this Congress—including the Bipartisan Infrastructure Law, the Inflation Reduction Act, and the PACT Act—at the same time that they look for new ways to implement the President’s agenda through rulemaking and enforcement.  In particular, it seems likely that the Federal Trade Commission and the Justice Department’s Antitrust Division will become even more active consistent with the Administration’s larger competition agenda. 

A key question moving into the next Congress is how those agency actions will interact with the strain of populism that partially animates efforts in both parties to regulate “Big Tech.”  The push to move certain antitrust legislation during the lame duck is unlikely to materialize; instead, it is likely to morph in the next Congress into a focus on content moderation and amending Section 230 of the Communications Decency Act.  Other priorities—like privacy and child protection, including bills like the Kids Online Safety Act—will almost certainly remain at the top of next year’s agenda if they do not pass as part of a larger spending bill this Congress.    

Continue Reading Midterm Elections: Democratic Reaction

On September 29, 2022, the U.S. House of Representatives passed a package of three antitrust bills (H.R. 3843) by a vote of 242-184. The package includes: (1) the Merger Filing Fee Modernization Act; (2) the Foreign Merger Subsidy Disclosure Act; and (3) the State Antitrust Enforcement Venue Act.

The Merger Filing Fee Modernization Act updates the structure and amounts of the premerger filing fees that the Federal Trade Commission (“FTC”) and Department of Justice (“DOJ”) collect pursuant to the Hart-Scott-Rodino Antitrust Improvements Act of 1976. The bill reduces fees for smaller transactions, increases fees for mergers valued at $1 billion or greater, and adjusts the filing fee amounts for each future year based on changes in the Consumer Price Index. Finally, it requires the FTC and DOJ to report each year on the total revenue generated from premerger notification filing fees, broken out by tier, and the FTC must also include in the report a list of all actions the agency took or declined to take based on a 3-to-2 vote.

The Foreign Merger Subsidy Disclosure Act requires parties submitting premerger notifications to disclose detailed information on subsidies from a “foreign entity of concern.” A foreign entity of concern is defined under 42 U.S.C. § 18741(a) and includes entities designated as foreign terrorist organizations, entities included on the Specially Designated Nationals and Blocked Persons List, and entities alleged to be involved in espionage or unauthorized conduct detrimental to the national security or foreign policy of the United States. The definition further covers entities owned by, controlled by, or subject to the direction of the governments of the Democratic People’s Republic of Korea (North Korea), the People’s Republic of China, the Russian Federation, or the Islamic Republic of Iran.

Continue Reading U.S. House of Representatives Passes Antitrust Legislative Package