State Senate: Republican majority
State House: Republican majority
Governor: Kevin Stitt (R)
Attorney General: Gentner Drummond (R)
Summary:
Deep-red Oklahoma has recently considered bills that would allow censored social media users to sue the platforms, require social media companies to publicize their algorithms and terms of service, limit social media use by minors, limit the use of generative AI in elections, and create penalties for websites with harmful material, including child pornography, while imposing age-verification requirements; the last of these became law in April 2024.
Of note is HB 3188, a 2022 bill that would have prohibited the State of Oklahoma and any entity receiving state funds from interfering with online free speech. It was introduced by Rep. Logan Phillips [R] in February 2022 but did not progress.
Key Policymakers:
- Sen. Jerry Alvord [R] and Rep. Toni Hasenbeck [R]
Legislative Activity:
SB 1996: Provides that social media sites operating in the state shall be subject to a private right of action by a social media website user if the social media website purposely deletes or censors a user’s political or religious speech or uses an algorithm that deletes political or religious speech. Damages can include up to $75,000.00 per intentional deletion or censoring. If the platform censors a candidate for statewide office, the company shall be fined $250,000.00 per violation per day. If the platform censors a candidate for any other office, the company shall be fined $25,000.00 per violation per day. The measure also prohibits the use of a user’s alleged hate speech as a basis for justification or defense to the action against the social media website at trial. The Attorney General may bring a civil cause of action under this section on behalf of social media website users who reside in the state. Introduced by Sen. Rob Standridge [R] in Feb. 2024 but has not progressed.
HB 2548: Creating the Oklahoma Social Media Transparency Act of 2023; establishing industry requirements for social media companies; requiring publication of standards; requiring consistent application; requiring notice; providing guidelines for censorship or shadow bans; requiring that certain information be provided to users; mandating that users be provided post-prioritization and shadow ban algorithm opt-outs; requiring annual notice; prohibiting post-prioritization and shadow banning algorithms used on political candidates; requiring social media platforms to allow deplatformed users to access certain information; prohibiting censorship, deplatforming, or shadow banning of journalistic enterprises; providing requirements for notifications; exempting social media platforms from requirements if censored content is obscene; permitting investigation by the attorney general; creating a private cause of action. Introduced by Rep. Terry O’Donnell [R] in Feb. 2023 but did not progress.
SB 1019 (2021): Creates a $10,000.00 fine for each instance of a social media company removing or censoring a post when such action is not required by law. Any entity that has a business facility or subsidiary in this state and receives any tax break, subsidy, exemption, or incentive shall be ineligible to receive such benefits for any year in which it has engaged in censorship activities. Introduced by Sen. Nathan Dahm [R] and Rep. Jay Steagall [R] in Feb. 2021 and was reported favorably out of committee, but did not advance further.
HB 3188 (2022): Prohibiting interference by the State of Oklahoma with online free speech; prohibiting interference with online free speech on the part of entities receiving funds from the State of Oklahoma. Introduced by Rep. Logan Phillips [R] in Feb. 2022 but did not progress.
HB 3914: Places age restrictions on social media usage. Social media companies may not allow minors under 16 to use social media, must require parental consent for minors 16 or older, and must use age verification. Companies that violate this regulation are subject to a $2,500 fine per violation plus court costs, attorney fees, and damages. The Attorney General is also authorized to take legal action. Additionally, commercial entities are prohibited from retaining user information once the user is granted access to the social media platform. Introduced by Rep. Chad Caldwell [R] and Sen. Ally Seifried [R] in Feb. 2024 and passed the House largely along partisan lines in March.
SB 1655/HB 3825: Prohibits any person or entity from distributing synthetic media messages that the person or entity knows are deceptive and fraudulent deepfakes of candidates or parties on the state or local ballot within 90 days prior to the election. Synthetic media messages are images, audio recordings, and video recordings of an individual’s appearance, speech, or conduct that have been created or intentionally manipulated with the use of generative adversarial network techniques or other digital technology in a manner that creates a realistic but false image, audio, or video. Such messages may be distributed if a disclosure is included in a manner outlined in the measure. Candidates depicted in such media that does not disclose its nature may seek injunctive or other equitable relief. Materially deceptive audio or visual media that constitutes satire or parody is excluded. Introduced by Sen. Dave Rader [R], Rep. Arturo Alonso-Sandoval [D], and Rep. Jeff Boatman [R] in Feb. 2024 and passed the Senate unanimously in March.
SB 1959 (Signed into Law): Provides that any commercial entity that knowingly and intentionally publishes or distributes obscene material, or material that depicts or promotes child pornography or child sexual exploitation on the Internet may be held liable to an individual for nominal damages, actual damages, punitive damages, court costs, and reasonable attorney fees as ordered by the court. The measure requires commercial entities to provide Internet service subscribers and cellular service subscribers the opportunity to request that access to material deemed harmful to minors be denied. After receiving the request, the commercial entity shall block access to its website on any device seeking to access its website using the subscriber’s Internet service or cellular service subscription so that a minor does not receive material harmful to minors via that subscription. Commercial entities that fail to comply with the request shall be held liable for damages to the minor. The commercial entity shall not be held liable if it uses reasonable age verification measures to restrict access to the site. Introduced by Sen. Jerry Alvord [R] and Rep. Toni Hasenbeck [R] in Feb. 2024, passed both chambers largely along partisan lines, and signed into law on April 26, 2024.
Legal Actions:
Attorney General Drummond Urges SCOTUS to Take Section 230 Case. On April 22, 2024, Attorney General Drummond joined a coalition of AGs urging the United States Supreme Court to consider a case that could dramatically limit the immunity that shields Big Tech companies from civil lawsuits. The case, Doe v. Snap, involves a young man seeking to hold Snapchat accountable for its role in his sexual abuse. An amicus brief filed by Drummond and 22 other states argues that Section 230 has been misinterpreted by lower courts and asks the U.S. Supreme Court to take the case and realign the law with its text and intended purposes.
A Texas district court dismissed the suit early and a panel of the U.S. Fifth Circuit Court of Appeals affirmed, both courts indicating they were bound by precedents providing broad immunity under Section 230. In a strong seven-judge dissent from denial of rehearing en banc, Judge Jennifer Walker Elrod wrote:
“Power must be tempered by accountability. But this is not what our circuit’s interpretation of Section 230 does. On the one hand, platforms have developed the ability to monitor and control how all of us use the internet, exercising a power reminiscent of an Orwellian nightmare. On the other, they are shielded as mere forums for information, which cannot themselves be held to account for any harms that result. This imbalance is in dire need of correction by returning to the statutory text.”
In the recently filed amicus brief, the attorneys general note, “Plaintiffs have gone after platforms for their role in sex trafficking and abuse, the proliferation of child pornography, cyberbullying and harassment, terrorism, trafficking illegal drugs and guns, and more. Courts have mostly blocked such lawsuits under section 230 – largely at the pleadings stage, when a plaintiff’s allegations, in all their horror, are taken as true …. As companies have racked up victory after victory, year after year, they have become increasingly brazen in condoning and aiding dangerous and illegal conduct on their platforms.”
Attorney General Drummond is joined in the brief by the attorneys general of Alabama, Alaska, Arkansas, Florida, Georgia, Idaho, Iowa, Kansas, Louisiana, Mississippi, Missouri, Montana, Nebraska, New Hampshire, New Mexico, North Dakota, Ohio, Pennsylvania, South Dakota, Texas, Utah and the District of Columbia.
Lawsuit Against Meta for Knowingly Damaging Youth Mental Health. On Oct. 24, 2023, Attorney General Gentner Drummond filed a lawsuit against Meta for knowingly designing and deploying harmful features on Instagram, Facebook, and its other social media platforms that are addictive to children and teens. This lawsuit is one of several filed across the nation in state and federal courts. A federal complaint filed in the U.S. District Court for the Northern District of California was joined by 33 states. Oklahoma is one of eight states filing lawsuits in their own state courts, along with the District of Columbia.
The complaint alleges that Meta knew of the harmful impact of its platforms on young people and that, instead of taking steps to mitigate these harms, the company misled the public about the dangers associated with its platforms. While much of the complaint relies on confidential material not yet available to the public, available sources – including those previously released by former Meta employees – detail that Meta profited by using algorithms that push young users into descending “rabbit holes” to maximize engagement.
The suit alleges that Meta knew these addictive features harmed the physical and mental health of young people, including body dissatisfaction, negative social comparisons, and undermining their ability to get adequate sleep. The impact of Instagram is particularly devastating for girls, according to Meta’s own research. The lawsuit alleges that Meta’s actions violate the Oklahoma Consumer Protection Act. The complaint seeks injunctive and monetary relief to rectify the harms caused by these platforms.