
Meta introduces Vibes, an AI video feed in the Meta AI app, letting users create, remix, and share short-form AI-generated videos.

Meta just officially launched Vibes, a short-form video feed full of AI-generated content in its Meta AI app and on meta.ai. It’s part of Meta’s big bet that creative AI will become a new frontier for social media.
Let’s unpack what’s going on: what it promises, where it might go wrong, and what it means for creators (and viewers) like you.
Here’s what Meta says — plus what the press is reporting — about how Vibes works and what you can actually do with it.
Short-form AI Video Feed
Every video in Vibes is (at least partly) generated or remixed via AI. As you scroll, you’ll see clips from creators and communities — not just human uploads.
Create, Remix, Personalize
You can generate a clip from a text prompt, remix a video you see in the feed, and personalize it by adding visuals, layering in music, or changing its style before you share it.
Share & Cross-Post
Once done, your video can be posted inside Vibes, sent via DM, or cross-posted to Instagram / Facebook Stories & Reels. If you see a Vibes clip elsewhere (say, Instagram), you can tap to remix it in the Meta AI app.
Feed Personalization
Over time, Meta’s algorithm will try to show you more of what you “like.” The more you interact, the more tailored your Vibes feed becomes.
Early Partnerships & Models
In early versions, Meta is working with external AI image/video creators like Midjourney and Black Forest Labs. It’s also continuing to build its own AI models behind the scenes.
Preview / Beta Phase
Meta frames Vibes as an “early preview” — meaning features may shift, and user feedback will shape future updates.
To help people understand what “AI video” might look like, Meta released some demo content. The clips are whimsical and visually intriguing, but often abstract: more aesthetic experiments than storytelling.
Lower barrier to video creation
You don’t need to shoot or edit a physical video. Prompt + remix = your video.
Creative experimentation & remix culture
Want to take someone else’s idea and twist it? Vibes encourages you to play with existing media.
Cross-platform reach
Because of cross-posting, your AI video could live beyond the Meta AI app (Instagram, Facebook, etc.).
Strengthening Meta’s AI ecosystem
Vibes ties deeper into Meta’s strategy, where AI, social media, and hardware (smart glasses) can reinforce one another.
Early mover advantage
If AI video takes off, positioning itself now gives Meta a head start.

As cool as it sounds, Vibes isn’t without its shadows. A few red flags and tensions:
“AI Slop” & Quality Issues
Many users are already calling the output “slop”: glitchy, uncanny, lacking coherence. Early reactions included comments like:
“gang nobody wants this”
“Bro’s posting ai slop on his own app”
“What…?”
The biggest issue: visual fascination can only go so far. Without narrative depth or meaning, these videos risk being gimmicks.
Authenticity vs. Synthetic Content
Meta previously pushed creators to emphasize “authentic storytelling” over shallow replicable content. Now it’s launching a feed full of synthetic visuals — a tension many have pointed out.
Content Moderation & Misinformation
AI video can be misused — hallucinated scenes, altered reality, deepfakes. How will Meta moderate or ensure veracity?
Flooding Feeds with “Crap Content”
Because creation is cheap, the quantity of AI content could outpace its quality and crowd out more meaningful human-made work. Some commentators already warn of a “junk” problem.
To understand Vibes, we must see the broader chessboard. Here’s what’s in play:
Meta’s AI Reorg & Ambitions
In June 2025, Meta reorganized its AI work under a division called Superintelligence Labs, following challenges with the Llama 4 model and key departures. Vibes is part of that broader push to move from back-end research toward visible, front-end AI products.
Competition in the AI + Social Space
OpenAI, Google, and others are racing to embed AI into user experiences. Meta might see Vibes as a differentiator (or even necessity) to stay relevant.
Monetization Potential
If AI videos become ad units or branded, this could open new revenue streams. Meta already sees potential in image/video ad tools.
Integration with Hardware & Vision
Meta’s ambitions around smart glasses / AR / mixed reality suggest that future video capture + AI remixing could become seamless. Vibes gives a content surface for those future hardware tie-ins.
Gathering Data & Feedback Loop
By releasing early, Meta can see what users like, what fails, and iterate. It’s a strategy of build → learn → adapt.

Given Meta’s broader AI ambitions — reorganizing under Superintelligence Labs, pushing AR / glasses, and competing in the generative AI space — Vibes is more than a toy. It’s a signal.
But the question remains: will people engage with a feed full of synthetic visuals? Can it sustain beyond novelty? And how will Meta control misinformation, hallucinations, or low-quality content?