I.

Penn State’s president, Neeli Bendapudi, published a Forbes article titled “Why People Are The Smartest AI Investment—And What It Means For Universities.” She argues that companies should hire “AI-ready graduates” because they “reduce onboarding costs, strengthen teams and accelerate innovation.” She closes by claiming “the organizations that thrive in the era of AI will not be defined by the tools they purchase. They will be defined by the people they hire now and empower to grow into tomorrow’s leaders.”

The LinkedIn post promoting the article collected 288 reactions and 16 comments. The commenters held titles like Director, Vice President, Dean, Managing Partner, Retired Four-Star General. One promoted a new research center at Penn State. Another pivoted within three sentences to selling AI security software. A third listed “Supercharged Ambassador” and “CHIEF Member” in her credential line and summarized World Economic Forum competency frameworks.

No students commented. No recent graduates struggling with the job market Bendapudi herself describes. No one displaced by the AI tools under discussion. Directors, VPs, deans, and generals talked about a product, in public, in front of the product, without addressing the product.

The product is you, if you’re a Penn State student paying tuition right now.

Bendapudi acknowledges a real problem in the article. She notes that “some companies are reducing early-career roles under the assumption that AI will absorb traditional entry-level tasks.” She cites a McKinsey survey: 48% of employees rank training as the most important factor for AI adoption, while 22% receive minimal or no support. Graduates in 2025 faced one of the toughest job markets in a decade.

Her solution: universities should “embed AI literacy across disciplines” so that graduates arrive pre-equipped to use AI tools in the workplace.

That framing does something specific, and I want to trace its mechanics.

II.

A student pays tuition. The university uses that revenue to fund instruction, facilities, administration, and executive compensation. In exchange, the university trains the student in skills employers need. An employer hires the graduate and captures productive value from those skills at a rate exceeding the graduate’s salary. The spread is profit.

The university sits between student and employer as a middleman. It sells the employer a pre-trained worker. The student finances the training. The employer captures the surplus.

In stable times, you accept this deal because the skills hold value across a career. You pay $80,000 for training that generates $2 million in lifetime earnings. The university and the employer both take their cut, but the net return to you is positive. The deal works because skill depreciation is slow. An accounting degree from 1995 remained useful in 2015. A mechanical engineering curriculum from 2005 still applied in 2020. Your education outlasted your loans.

AI compresses the shelf life.

III.

A student who enrolled in 2023 to study data analysis learned tools and methods current in 2023. By 2026, automation had replaced or augmented most of those tools. The curriculum committee meets once a year. The technology turns over once a quarter. You arrive at graduation carrying debt priced against a four-year-old snapshot of a field that reorganized itself three times while you were in school.

Penn State charged full tuition for all four years. You bore the depreciation risk. The university locked in revenue at enrollment.

All universities operate this way. Bendapudi’s article does something more specific: she frames the rapid depreciation of skills as a sales pitch to employers rather than a problem for students.

Her key paragraph:

“AI-ready graduates can bring skills that reduce onboarding costs, strengthen teams and accelerate innovation. They can help companies translate emerging technologies into measurable outcomes.”

Take each phrase from the student’s side of the table.

“Reduce onboarding costs”: you arrive pre-trained, so the company spends less getting you productive. The university absorbed that training cost. Your tuition funded it. You paid Penn State $80,000 so that a corporation could spend less integrating you into its workflow.

“Strengthen teams”: in the context of AI deployment, this means fewer people producing more output. A team of four that matches the output of a former team of ten is “stronger” by the metric a CFO cares about. The six absent people don’t appear in the sentence.

“Accelerate innovation”: ship faster with less labor. You operate the AI system that compresses timelines. You serve as the human interface between the model and the production environment.

“Translate emerging technologies into measurable outcomes”: convert AI capabilities into profit. “Measurable outcomes” is finance language. Revenue, margin, cost reduction. You function as a translation layer between software and a balance sheet.

Each phrase frames you as an input. Raw material that universities refine and companies consume. You fund the refining. The company captures the output. Bendapudi presents this as a “shared commitment between higher education and business leaders.” It is shared. You are not a party to the deal.

IV.

Two things changed.

First, Bendapudi acknowledges that AI eliminates entry-level roles, then proposes that universities train graduates harder for the remaining positions. This is a competitive positioning move. If ten universities each produce better AI-ready graduates, they compete more effectively against each other for a shrinking pool of jobs. Total labor demand still falls. Individual programs can win market share within a contracting market, but the contraction continues regardless.

Bendapudi optimizes Penn State’s position within a framework she acknowledges is deteriorating. She can’t write “the economic model underpinning four-year universities faces existential pressure from the technology we teach students to use” in Forbes, because that tanks enrollment. So she writes the version where universities solve the problem instead of suffering from it.

Second, the speed of skill depreciation breaks the traditional return-on-education calculation. An $80,000 degree that trained you for a 30-year career worked even with large university and employer margins. A degree that trains you for a role with a shorter, less predictable window before AI disruption works less well. The university still charges the 30-year price. The labor market operates on an accelerated clock. You absorb the difference.
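The arithmetic here is simple enough to sketch. A rough model (every figure below is an illustrative assumption, not data from the article): treat the degree as generating an annual earnings premium for however many years its skills stay current, and subtract the tuition.

```python
def degree_roi(tuition, annual_premium, useful_years):
    """Net return on a degree whose trained skills stay current
    for `useful_years`. All inputs are illustrative assumptions."""
    return annual_premium * useful_years - tuition

# Stable era: skills hold value across a ~30-year career.
stable = degree_roi(tuition=80_000, annual_premium=25_000, useful_years=30)

# Compressed era: same price, same premium, but the skills face
# AI disruption within ~5 years.
compressed = degree_roi(tuition=80_000, annual_premium=25_000, useful_years=5)

print(stable)      # 670000: the deal survives the middleman's cut
print(compressed)  # 45000: the margin nearly vanishes
```

The tuition is fixed at enrollment; only the useful-years term moves. That is the sense in which the student absorbs the difference: the price is set against the 30-year scenario, and the depreciation risk is entirely on the buyer.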

V.

The counterargument is partially right, and I should give it room.

Universities sell more than vocational training. They provide socialization, credential signaling, network access, intellectual development, and time to mature. These benefits don’t depreciate the way technical skills do. A Penn State alumni network functions in 2035 regardless of what AI does to data analysis jobs. Living independently, building friendships under structured conditions, encountering unfamiliar ideas: that has real developmental value.

Bendapudi’s article doesn’t make this argument. She doesn’t pitch Penn State on developmental value, networks, or socialization. She pitches it on workforce readiness. She addresses employers, not students. The article’s audience is corporate decision-makers choosing where to source talent. The graduate is the product being sold. The pitch: Penn State’s product reduces your costs.

If the value of a university degree rests on non-vocational benefits like networks, socialization, and personal growth, then the vocational pricing model (charge $80,000 because the degree generates $2 million in career earnings) loses its foundation. You’d need to price the degree against the value of the non-vocational benefits alone, and that’s a harder sell at current tuition levels.

VI.

Scott Alexander has written about how institutional actors coordinate narratives that serve institutional interests while believing they’re doing something else. The LinkedIn comments section is a clean example.

Keith Cheng, a Director at Penn State, uses his comment to promote a new Center for Computational Phenomics. He frames this as consistent with Bendapudi’s vision. It is. The center creates faculty positions, attracts grants, and produces graduates trained in a specialty. Cheng and his colleagues probably believe in the mission. His comment still functions as a product line announcement underneath a sales post. Institutional stakeholders signal to each other that they share a framework, reinforce each other’s positioning, and maintain a consensus that serves all of their interests. The people the consensus doesn’t protect weren’t in the thread.

VII.

I don’t have a solution. I should be honest about that rather than constructing one to satisfy the structure of a blog post.

I build AI systems. My company benefits from the dynamics I’ve described. I deploy AI tools that automate workflows, and when I do, the value that used to flow to the people performing those workflows redirects toward my company and its investors. I know this. I do it anyway, because if I don’t, a competitor does, and I’ve surrendered my position for zero change in outcome.

A university can’t stop training AI-ready graduates because peer institutions will capture its market share. A company can’t stop deploying AI because competitors will undercut it on cost. A student can’t opt out of the university system because the credential remains a hiring filter. An AI founder can’t stop building because someone else builds it instead. Each actor responds rationally to local incentives. The aggregate outcome serves no one except capital owners.

I can name the dynamic so that you can navigate it with open eyes rather than absorbing institutional narratives designed to keep you buying.

If you’re a student: “AI-ready” in Bendapudi’s sense means “useful to employers deploying AI.” That differs from “positioned to benefit from AI.” You benefit from AI by owning equity in AI systems. The university trains you to operate those systems for a salary, because that’s what it sells. Training you to own the systems would make you a competitor to its corporate partners rather than a product for them.

If you’re a recent graduate: the Forbes article was written for the people who might hire you, not for you. Your interests and theirs overlap on the narrow question of whether you get a job. They diverge on how much of the value you produce flows to you versus to the company and its shareholders. “Reduce onboarding costs” is an employer benefit.

If you’re a university administrator: I recognize that you can’t say any of this publicly without damaging enrollment. I recognize that the incentive structure makes honesty about these dynamics almost impossible within institutional channels. I recognize that Forbes Council posts serve a legitimate marketing function. I’m not asking you to torpedo your own institution. I’m asking you to stop framing the narrative as serving students when it serves employers and universities at student expense.

VIII.

Bendapudi closes: “The organizations that thrive in the era of AI will not be defined by the tools they purchase. They will be defined by the people they hire now and empower to grow into tomorrow’s leaders.”

I’m a Penn State student who builds AI systems, and I don’t have a better story yet. But I think we owe each other enough honesty to admit the current one is breaking.