As I wore them on one of my walks through San Francisco, on the shore of Ocean Beach, I came upon a dolphin-like fish that had washed up on the sand. Though I got my camera glasses close enough to the thing that I could smell it, Meta’s AI assistant could not tell me what kind of animal it was. It correctly identified that it was very dead and that I should not touch it. It was then able to direct me to a number to call for city animal control services.
Beyond instances like that, I tend to avoid the AI voice interaction because I haven’t gotten to the point where it feels natural. Getting it to search something is usually very quick, but doing so requires you to stop dead in your tracks, stare directly at another person’s purse or something, and say out loud, “Hey Meta. HEY META. Is this bag Gucci?”
The glasses’ AI features are both their best asset and their biggest weakness. Features like live language translation and whispered map directions are very helpful. But if you’ve spent any time curating the AI slop out of your Facebook feed lately, you’ll know that Meta just can’t help but pack a firehose blast of AI features into everything it does.
The software features are funneled through the same app as Meta’s AI services. That’s where pictures and videos go by default, and sometimes you have to open the app to import the files from the glasses. There’s a very clear problem with using that app: bad vibes.
The Vibes Are Off
When you go into the Meta AI app to look at the pictures or videos you’ve taken, the first thing you’ll see is Meta’s terrible new Vibes service: a constant barrage of AI slop videos that Meta just one day foisted upon its app users. Vibes is akin to OpenAI’s dubious Sora app, but of somehow even worse quality.