House Committee Approves Limited AI Regulations
The North Carolina House Judiciary 3 Committee held a hearing on House Bill 933, the AI Regulatory Reform Act, and gave the bill a favorable report, sending it to the next committee in the process.
What the legislation does:
This legislation marks the first time the General Assembly is tackling some of the negative impacts of AI.
The legislation limits the distribution of deepfakes made with the intent to harass, extort, threaten, or otherwise harm an individual, and it creates both criminal and civil liability for creating deepfakes to harm people.
North Carolina law already forbids using sexual and obscene images to harass individuals.
The second component of the legislation would grant AI developers immunity from errors that arise when a learned professional uses AI in serving a client.
For example, a lawyer who gives a client incorrect legal advice derived from ChatGPT cannot sue the AI developer over the error.
Tim’s Take: This marks the General Assembly’s first legislative effort to regulate some of the harms of AI while encouraging future development. The issue will only grow in importance as North Carolina emerges as an AI hotspot. The bill will likely draw some scrutiny from First Amendment advocates, since it directly prohibits deepfakes of candidates during election season unless the deepfake has newsworthy value, such as commentary, criticism, satire, or parody.