
Ringing in the new year is a time for celebration, one often filled with hope, gratitude, and reflection on the year just passed.
But come that Monday after New Year’s, reality sets in. Students return to school, and business beckons us back from the break.
On social media, the looks back at 2025 have mostly ended; gone, too, are pics of revelers smiling as the clock struck midnight, and all the cute, cozy posts by the fire. Today we’re reenergized, rejuvenated, ready to get down to work—and perhaps at least a little worried.
Much of that worry, for businesses and individuals who rely on social media for exposure and income, is blamed on AI. “After more than a decade of growth and addictive engagement, the combination of low-quality AI content, hostile advertising, and user fatigue is set to make social platforms ‘less compelling’ in 2026,” read an article in C21Media. The mental health impact on users, partisan ownership of platforms, and the increasing difficulty of detecting fake photos and videos are among the issues swirling around social media as we enter the new year.
Instagram chief Adam Mosseri addressed this topic, and how users can cope with the challenges posed by the rise of AI, in a New Year’s post. “Everything that made creators matter—the ability to be real, to connect, to have a voice that couldn’t be faked—is now accessible to anyone with the right tools. Deepfakes are getting better. AI generates photos and videos indistinguishable from captured media,” Mosseri wrote in an all-text carousel post on Instagram (could this be a new trend, we have to wonder?).
“Authenticity is becoming a scarce resource, driving more demand for creator content, not less,” the post continued. “The bar is shifting from ‘can you create?’ to ‘can you make something that only you could create?’”
Mosseri went on to advise creators to steer clear of content that’s too perfect, too polished. “In a world where AI can generate flawless imagery, the professional look becomes the tell,” he said.
In a rare criticism of the platform he runs, Mosseri explained that while Instagram does “good work” identifying AI content, all social platforms will “get worse at it over time as AI gets better.” He implored users to pay attention to who is behind an account—is it a trusted source?
“We need to surface much more context about the accounts sharing content so people can make informed decisions,” he wrote. And creators have to show authenticity by “being real, transparent, and consistent,” Mosseri said. He ended his post by acknowledging that Instagram will “have to evolve…and fast.”
So what do we get from this? That as long as creators stay authentic—and post photos that aren’t as polished as millennials are accustomed to seeing when they open Instagram—they can maintain trust? It’s not as simple as that, and commenters on the post were quick to remind the IG head that much of this dilemma has been propelled by the very company he works for.
“Why not take responsibility for the mess you and @zuck have created instead of acting surprised or like you’re a passenger along for the ride?” one user commented. “You control this platform. You can put safeguards in place to weed out AI, protect children, and return IG to its supposed intended purpose of connecting people—but instead you choose to prioritize ad revenue and an algorithm that gets folks to stay on for as long as possible even though you have the data in hand that shows social media is bad for everyone’s mental health. Be an adult and take ownership of what this platform is now, you did this.”
Another commenter who echoed the desire to see Instagram take some responsibility for its decisions and algorithm changes singled out those that have resulted in losses for small businesses. “This year’s algorithm changes have gone much further than a shift in reach. They have directly affected people’s livelihoods. Established businesses have seen income drop, work dry up, and years of audience-building undermined almost overnight. This is not theoretical or short-term. For many, the impact is ongoing and deeply destabilizing,” the user wrote.
Some tech publications have also been distrustful of Mosseri’s statements. “He’s a corporate drone, with little personality or passion about anything in particular, and no matter what, he’s going to tow the company line, and say whatever is best for Instagram and/or Meta, no matter what,” wrote Andrew Hutchinson in Social Media Today. “Mosseri’s trying to justify the influx of AI content, rather than working to protect creators and give users more choice about what they see in the app…. More people creating with AI is better for Meta, so Mosseri’s basically waving the white flag and saying that creators are going to have to get better at producing original content if they want to keep up with AI fakes.”
For content creators, the general points from all this discussion are: (1) Be consistently raw and transparent; (2) Don’t use AI; (3) AI will eventually copy your aesthetic anyway, so maybe use it?; (4) Of utmost importance, be original.
Originality really does matter when it comes to social media—and for the same reason that jewelry brands expertly hone their aesthetic and design work to iconic status. “You know what kind of AI-generated content succeeds? Content with a good concept, a human-originated idea that forms the kernel of the depiction,” wrote Hutchinson.
“AI tools can’t come up with human ideas, which remains the key differentiator, and AI tools can’t develop the same relationship with an audience as the top online creators.”
This is difficult to do, and that’s putting it mildly. As we get to work in 2026, originality, transparency, and, above all, being 100% human might be the keys to success. Or, in an alternate universe, will 2026 be the year we all finally leave Instagram? Somehow I doubt that.
(Photo: Getty Images)