

Carla Hayden was the first woman and African American to be Librarian of Congress before Trump ousted her. Two months later, she’s landed on her feet


The Andrew W. Mellon Foundation exclusively told The Associated Press that Carla Hayden will join the humanities grantmaker Monday as a senior fellow whose duties will include advising on efforts to advance public knowledge through libraries and archives.

The year-long post places Hayden back at the center of the very debates over American culture that surrounded her dismissal. The White House ousted Hayden, the first woman and the first African American to hold the title, after a conservative advocacy group seeking to stamp out opposition to Trump within the federal government accused her of promoting “radical” literary material.

Hayden acknowledged existing threats to “the free exchange of ideas” in a statement to the AP.

“For generations, libraries, archives, and cultural institutions have been the guardians of knowledge and the catalysts for human progress,” she said. “Together, we will work to strengthen the public knowledge ecosystem and ensure that the transformative power of information remains accessible to all.”

Mellon’s response to government funding cuts

Meanwhile, the Mellon Foundation has been working to fill fiscal holes for arts communities reeling from federal cuts. Its $15 million “emergency” fund aims to offset the $65 million that was supposed to go to the state humanities councils that organize book fairs, heritage festivals, theater productions and other programs fostering cultural engagement.

The foundation has previously supported the American Library Association’s efforts to counter book bans, increase scholarships for librarians of color and boost adult literacy.

Mellon President Elizabeth Alexander said the foundation is thrilled to welcome Hayden, “a leader with an unshakable regard for the public good of the American people,” during such a “crucial time.” Public knowledge institutions are navigating “historic challenges and transformative advances,” according to Mellon, including artificial intelligence, digital technologies, federal funding withdrawals and censorship efforts.

Who is the Librarian of Congress and what is the job?

The acting librarian is Deputy Attorney General Todd Blanche, who represented Trump during his 2024 criminal trial.

Responsibilities range from looking after collections to selecting the country’s poet laureate to awarding the Gershwin Prize for Popular Song and the Library of Congress Prize for American Fiction. The library also manages the nonpartisan Congressional Research Service.

Librarians of Congress serve 10-year terms, and Hayden’s was scheduled to end in 2026. Her tenure included modernizing the library’s reservoir of the nation’s books and history. She oversaw new initiatives reaching out to rural and online audiences, and recent campaigns sought to improve accessibility for everyday visitors. She also arranged for Lizzo’s 2022 performance, in which the artist played a crystal flute owned by President James Madison, one of the many artifacts in the library’s collections.

Before her confirmation in 2016, Hayden spent more than two decades as CEO of Baltimore’s Enoch Pratt Free Library system and was president of the American Library Association from 2003 to 2004. A graduate of Roosevelt University and the University of Chicago, she is a member of the American Philosophical Society and the American Academy of Arts and Sciences.





Want a job in AI-era tech? Forget prestigious degrees—tech leaders want to see your GitHub projects and internships


For decades, computer science has been sold as one of the surest paths to economic security. And leaders across politics and industry—from former President Bill Clinton and Secretary of State Marco Rubio to Steve Jobs and Bill Gates—have at times urged students not to overlook the field, framing coding skills as the secret to stable, high-paying jobs.

But as artificial intelligence rapidly reshapes the workplace, that promise is starting to look less certain.

A new survey of more than 200 engineering leaders, conducted by tech training nonprofit CodePath and shared exclusively with Fortune, shows entry-level tech hiring is slowing. More than one-third of respondents, 38%, said their company has reduced entry-level hiring over the past year, and nearly 1 in 7 reported pausing Gen Z hiring altogether.

At the same time, 18% said hiring had stayed the same, and 8% reported an increase. Despite the overall slowdown, CodePath CEO Michael Ellison—a Y Combinator alum—argues telling people to avoid tech right now would be a mistake.

“That’s just kind of like taking crazy pills if you end up choosing not to invest in the tools that make you the most powerful—of telling computers what you want them to do in an age where computers are becoming exponentially more powerful,” Ellison told Fortune. “So to me, it’s like saying, ‘don’t learn how to use the internet.’”

Ellison’s argument reflects a broader shift in how computer science fits into the AI economy. As generative AI tools become more capable, understanding how software works—and how to direct, customize, and integrate AI systems—is increasingly seen as a foundational skill rather than a specialized one.

That demand is already showing up in the labor market. AI literacy topped LinkedIn’s list of the skills professionals are prioritizing and companies are hiring for right now. And a Lightcast analysis of more than 1.3 billion job postings in 2024 found roles advertising at least one AI or generative AI skill offered an average of $18,000 more in annual compensation than those that did not.

Notably, the majority of those roles were outside the tech sector. Some 51% of jobs requiring AI skills were in non-tech industries, up from 44% in 2022—a sign coding and AI fluency are becoming relevant far beyond Silicon Valley.

The new secret to landing a tech job

Still, slowing hiring doesn’t mean aspiring technologists should give up. Instead, the CodePath data suggests candidates may need to rethink what they emphasize—and what they leave off—when applying for tech roles.

When asked which signals matter most outside the interview process, engineering leaders indicated proof of real-world skills matters far more than formal credentials. Side projects or portfolios topped the list, cited by 38% of respondents, followed by internship experience (35%) and public code portfolios like GitHub (34%).

Traditional markers of achievement, by contrast, carried far less weight. Just 4% of leaders said credentialing programs were a top influence in hiring decisions, while only 23% cited a candidate’s degree or academic focus and 17% pointed to school prestige.

The shift away from pedigree suggests employers are seeking evidence that candidates can actually do the work. Greater fluency with AI tools and frameworks was the most common skill expectation for early-career hires, followed by faster time to production-ready code and the ability to learn new tools or programming languages quickly.

And despite buzz about tech layoffs, job opportunities do still exist. The U.S. federal government, for example, recently announced it would be hiring about 1,000 new engineers, data scientists, and AI specialists. No degree or prior work experience is required, and salaries will range from $150,000 to $200,000. Meta has also been hiring young talent in recent weeks, with job postings for roles such as product software engineers.

Ellison’s advice for those seeking roles is simple: Opportunities are out there as long as you are willing to dig deeper—and build the kind of portfolio hiring managers are looking for.

“People are rewarded for being aggressive and for going after what they want,” he said. “It’s surprising the opportunities that are hidden in plain sight.”

This story was originally featured on Fortune.com







You’re not imagining it: The AI job squeeze isn’t some future apocalypse; it’s already quietly underway.

Professor Yoshua Bengio spent four decades building the technology that is now coming for your job. He is a computer science professor at the Université de Montréal, a Turing Award winner, and one of the most-cited scientists in the world on Google Scholar—and now he’s turned his back on his life’s work to warn that your job is probably already under threat. 

Desk jobs, or as Bengio called them, “cognitive jobs, the jobs that you can do behind a keyboard,” will be the first casualties of automation. 

“It’s just a matter of time,” the AI pioneer stressed on Steven Bartlett’s Diary of a CEO podcast.

“Unless we hit a wall scientifically, like some obstacle prevents us from making progress to make AIs smarter and smarter, there’s going to be a time when they’ll be doing more and more, able to do more and more of the work that people do … And then, of course, it takes years for companies to really integrate that into their workflows, but they’re eager to do it. So it’s more a matter of time than, is it happening or not?”

And he admitted that it’s Gen Z new hires who are currently being hit hardest by AI, as junior roles are the easiest to cut, consolidate, or backfill with software—but he expects everyone’s job to be affected within five years.

It’s not just office jobs that are at risk; even trade jobs and democracy itself are threatened 

For years, degrees were pushed as the key to success for the young and aspirational looking to land well-paying, stable jobs. But now, even highly educated students are finding themselves “unemployable” as employers adopt a “wait-and-watch” strategy amid the rise of AI. Graduates in the U.K. are facing the worst job market since 2018. And companies like Intel, IBM, and Google have been freezing thousands of would-be new roles that AI is expected to take over in the next five years.

But it’s not just a blip or a reflection of the current economy, Bengio warned. As more firms lean on AI and eventually robots, too, the technology will only get smarter, he said. 

“As companies are deploying more and more robots, they will be collecting more and more data. So eventually, it’s going to happen,” Bengio said when asked whether AI will be able to wipe out all work. Even young people trying to outsmart automation by ditching degrees or upskilling into trade jobs are destined for the same dead end.

“So if you do a physical job—as Geoffrey Hinton is often saying, you should be a plumber or something—it’s going to take more time [for AI to replace your job], but I think it’s only a temporary thing.” 

Now, knowing the devastation AI could cause, Bengio said he regrets his life’s work. 

“I should have seen this coming much earlier, but I didn’t pay much attention to the potentially catastrophic risks,” the 61-year-old admitted. “But my turning point was when ChatGPT came, and also with my grandson, I realized that it wasn’t clear if he would have a life 20 years from now, because we’re starting to see AI systems that are resisting being shut down.”

He’s since founded LawZero, a nonprofit organization focused on building safe and human-aligned AI systems. But at the current rate of change, his warning is clear: It’s not just jobs; even democracy could collapse in as little as two decades.

His message for CEOs? “Step back from your work. Talk to each other, and let’s see if together, we can solve the problem. Because if we are stuck in this competition, we’re going to take huge risks that are not good for you, not good for your children.”








If it weren’t for a Volkswagen bus and a calculator, Apple might never have existed. In the mid-1970s, the late cofounder Steve Jobs was in his early twenties and strapped for cash, but hooked on the idea that everyone should be able to own a home computer. The only problem? Like many founders, he didn’t have enough money to bring his vision to life.

So Jobs sold off his Volkswagen bus while fellow cofounder Steve Wozniak sold his programmable calculator, raising $1,300 to pay for the prototype’s parts. And the first Apple computer, the Apple I, was born on April Fools’ Day, 1976.

The sacrifice paid off: A local computer dealer placed a $50,000 order for 100 units soon after the machine launched, with most units bought up by hobbyists. That was enough money for the entrepreneurial duo to create the Apple II for the mass market—the first personal computer to include a keyboard and color graphics. A year after its 1977 debut, it made nearly $3 million.

“I was worth about over $1 million when I was 23, and over $10 million when I was 24, and over $100 million when I was 25,” Jobs told PBS in 1996. “And it wasn’t that important, because I never did it for the money.”

The days of selling their belongings to fund their fledgling business were long behind them.

From college dropout to $10.2 billion net worth: Jobs’ path to Apple success

Jobs didn’t discover his passion for technology in a college class; at age 12, the entrepreneur had already found his true calling and took a massive leap of faith to pursue his dreams.

A young Jobs thumbed through the yellow pages, hunted down the phone number of Hewlett-Packard cofounder Bill Hewlett, and rang him up for a favor. At the time, the tween needed spare parts to build a frequency counter. But what he received was far better than some nuts and bolts; Hewlett offered Jobs an internship at the iconic $21.4 billion tech company, where he serendipitously met a talented engineer: Wozniak.

Together, the pair started their first business, illegally selling “blue boxes” that allowed users to make free, long-distance telephone calls. Jobs reminisced about those years in the early 1970s as a “magical” time in his life that sent him on the path to soon create Apple. 

“Experiences like that taught us the power of ideas,” Jobs said in the 1998 documentary Silicon Valley: A 100-Year Renaissance. “If we hadn’t … made blue boxes, there would have been no Apple.”

Jobs later enrolled at Reed College in Portland, Ore., but his days of higher education were short-lived. He dropped out after just one semester, eventually working for legendary brand Atari as a technician and game designer at just 18 years old. That would be the last time Jobs worked under somebody else; just two years later, the Apple I hit the market, and Jobs was well on his way to becoming one of the most visionary tech pioneers in modern history.

Fast-forward five decades, and Apple is the second most valuable company in the world. The business sits in fourth place on the Fortune 500, having sold more than 3 billion iPhones and boasting more than 100 million Mac users globally.

At the time of his passing in 2011, Jobs was estimated to be worth $10.2 billion. Although he had enough money to buy a whole fleet of luxury cars shortly after founding Apple, selling his Volkswagen proved to be a critical sacrifice in making it to the top.


