UVA Alum Renders Social Media Victory
The national trade association NetChoice recently garnered a resounding endorsement of its First Amendment claims against the Commonwealth of Virginia. As I wrote two weeks ago, in November 2025 NetChoice, which is composed of powerful Internet commerce companies, sued then-Attorney General Jason Miyares to preclude enforcement of a new provision of the Virginia Consumer Data Protection Act that seeks to limit minors’ use of social media. [1] Jay Jones has since taken up the mantle against NetChoice, defending the 2025 statute in the Eastern District of Virginia.
The court had previously announced that it would delay its summary judgment ruling until it had decided NetChoice’s motion for a preliminary injunction. [2] At the time, I wondered whether that decision should be read as a harbinger of the preliminary injunction ruling, or whether it was simply standard practice, preliminary injunctions being putatively urgent in nature and thus naturally taking priority. I am not well-versed enough in the federal docket to know whether judges do this frequently, in which case it should not be ascribed any particular significance. What I can say, having read the preliminary injunction decision, is that it would be absolutely stunning if the Attorney General’s motion for summary judgment were granted.
Judge Patricia Tolliver Giles, a Biden appointee, Law School alum, and former Assistant United States Attorney, ruled on the motion for preliminary injunction. [3] Judge Giles last year issued a preliminary injunction ordering the Youngkin administration to stop removing voters from voter rolls within the 90-day quiet period mandated by the National Voter Registration Act and restore illegally removed voters. [4]
Judge Giles issued her decision on February 27, not only granting NetChoice its preliminary injunction but doing so with reasoning that reads as a wholesale endorsement of NetChoice’s legal theory. [5] In other words, as the opinion is written, this was not a close call for the Eastern District. I will walk through the decision to explain what I mean by this.
NetChoice first had to prove that it had standing to bring the lawsuit. Although the larger lawsuit makes multiple constitutional claims, the preliminary injunction was decided exclusively on First Amendment grounds. [6] To that end, the court’s first order of business was to assess NetChoice’s standing to bring its First Amendment claims.
What I found particularly notable about NetChoice’s First Amendment claims was that they were made both on its own behalf (that is, on behalf of its member organizations) and on behalf of those members’ adult and minor users. [7] It seems very intuitive that the users of social media platforms might object to their access being restricted, though they are not the plaintiffs here. What is less intuitive is that NetChoice’s own members would argue that their speech is unconstitutionally constrained.
A recent NPR article gets at this question of whether social media companies themselves participate in speech, but it reveals a different perspective than that taken by NetChoice. The article centers on Meta’s curtailment of its content moderation: Mark Zuckerberg addressed Meta’s recent moves to scale back on fact-checking and automated systems, with exceptions for “illegal and high-severity violations.” [8] Meta Chief Global Affairs Officer Joel Kaplan commented that the company would emulate X’s model of “community notes,” in which users comment on and rate specific posts: “I think Elon’s played an incredibly important role in moving the debate and getting people refocused on free expression.” [9]
Meta is a member of NetChoice, alongside X. [10] Even as it heavily prunes its fact-checking apparatus, Meta faces a products liability-style negligence lawsuit, in which Zuckerberg personally testified, alleging that Meta and other social media companies concealed their knowledge of their platforms’ addictive natures. [11] YouTube, another defendant in the lawsuit, is also a member of NetChoice. [12] While there certainly may be distinctions to be made between the components of social media that cause it to be addictive and the putative First Amendment speech by these companies, it is a thin and perhaps blurry line. Judge Giles acknowledges in her opinion that “minors are particularly susceptible to its [social media’s] addictive features.” [13] The question arises: What is the precise relationship between NetChoice members’ alleged speech and the allegedly addictive nature of their products? To what extent can a claim of speech, which is presumably intentional in some manner, be evidence of culpability for addictive products?
These questions remain unanswered. What we do have an answer to, however, is whether the Eastern District was visibly troubled by them: the answer appears to be “no.” Judge Giles quickly disposes of the question of whether these corporations are likely engaged in protected speech with the conclusory statement that where a government regulation forbids some action by the plaintiff, “standing is usually easy to establish.” [14] The opinion elaborates later, adding that NetChoice members curate and disseminate content “that they deem to be most valuable to users and that complies with their rules about content allowed on their services.” [15] Meta, Google, and others make, according to the opinion, value judgments about what to share with their users, which qualifies them as First Amendment speakers. The opinion quotes YouTube’s testimony touting its ability to share content “that YouTube thinks will be particularly relevant to them.” [16] Limiting NetChoice members’ ability to express themselves via content curation is, per se, an injury.
With these weighty questions lurking in the background, the court disposes of standing (finding third-party standing with respect to social media users) and moves on to the meat of the issue: likelihood of success on the merits. NetChoice’s resounding (if reversible) victory radiates from the reasoning in this section.
The court found that the social media restrictions would fail both strict and intermediate scrutiny, but heavily favored subjecting the statute to strict scrutiny. [17] This is the part of the opinion with which I struggle. The level of scrutiny to which the law is subjected depends on whether it is content-neutral: content-based laws require strict scrutiny, while content-neutral laws are routed through intermediate scrutiny. [18] As a brief refresher, the statute in question requires that social media companies (subject to some requirements) identify minor users and limit them to one hour of use per day, with the option for parents to adjust the time limit. [19] The statute does not limit access to particular types of content within a social media platform; instead, it acts as a blanket bar on any social media use by minors beyond one hour, unless parents decide to change the limit.
Reading the statute, it appears to me to be content-neutral. This is not a ban on obscene material, political material, content related to self-harm, and so on. The Eastern District construes the law differently, finding that it is content-based. [20] This finding rests on the statute’s definition of social media companies, which excludes platforms whose purpose is not to be a social media platform but that still have certain social media-like features. [21] Part of the language for this exclusion lists types of services that are not social media and are therefore excluded: “news, sports, entertainment, [and] ecommerce.” [22] The Eastern District interprets this part of the definition to mean that users’ access to those types of content is unrestricted, whereas access to other content is restricted. [23]
The court’s reasoning here conflates the statute’s jurisdiction with its function. The jurisdiction of the statute is social media companies. The function is to limit access to social media companies’ platforms. The provision on which the court focuses establishes jurisdiction: it identifies the types of platforms to which minors’ access is restricted. The court applies this language to the statute’s function, finding that minors’ access to news or sports is less burdened than their access to other, unnamed types of content. No evidence was adduced, however, that the social media time limit pretextually limits access to some other type of content that is harder to reach through other avenues than news or sports.
The Eastern District’s decision has since been appealed to the Fourth Circuit, with opening briefs due mid-April. [24] Although Judge Giles did find that the state had a compelling interest in passing the statute, the decision sided against the Attorney General at all other junctures in the analysis. [25] In the court’s own words, “The issues in this matter are not to be taken lightly.” [26] We shall just have to wait and see how the case develops.
NetChoice v. Jason S. Miyares, Docket No. 1:25-cv-02067 (E.D. Va. Nov. 17, 2025)
Id. Docket Item No. 48
Virginia Coalition for Immigrant Rights, et al., v. Beals, Docket No. 1:24-cv-1807 (E.D. Va. Oct. 25, 2024)
NetChoice v. Jason S. Miyares, Docket Item No. 50
Id.
Id. at 7
Id.
Supra 10, 11
Supra 5 at 2
Id. at 7
Id. at 8
Id.
Id. at 13-24
Id. at 14
Va. Code § 59.1-577.1
Supra 5 at 17
Va. Code § 59.1-575
Id.
Supra 5 at 16
NetChoice v. Jay Jones, Docket No. 26-01252 (4th Cir. Mar. 6, 2026)
Supra 5
Supra 5 at 27