In today’s AI news, Mark Zuckerberg announced a huge leap in Meta Platforms’ capital spending this year, to between $60 billion and $65 billion, an increase driven by artificial intelligence and a massive new data center.
That would raise the company’s capital expenditures by up to roughly 70% over 2024.
In other advancements, Hugging Face has achieved a remarkable breakthrough in AI, introducing vision-language models that run on devices as small as smartphones while outperforming predecessors that required massive data centers. The company’s new SmolVLM-256M model needs less than one gigabyte of GPU memory yet surpasses the performance of its Idefics 80B model from just 17 months ago, a system roughly 300 times larger.
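For developers who want to kick the tires, here is a minimal sketch of running the model locally with the transformers library. It assumes the checkpoint is published on the Hugging Face Hub under the HuggingFaceTB/SmolVLM-256M-Instruct identifier and follows the standard chat-template workflow for Hugging Face vision-language models; the image path is a placeholder.

```python
# Minimal sketch: one-image description with SmolVLM-256M via transformers.
# The checkpoint id "HuggingFaceTB/SmolVLM-256M-Instruct" is an assumption;
# adjust it if the published name differs.
import torch
from transformers import AutoModelForVision2Seq, AutoProcessor
from transformers.image_utils import load_image

device = "cuda" if torch.cuda.is_available() else "cpu"
model_id = "HuggingFaceTB/SmolVLM-256M-Instruct"

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForVision2Seq.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16 if device == "cuda" else torch.float32,
).to(device)

# Replace with any local path or URL to an image.
image = load_image("example.jpg")

# Build a chat-style prompt with one image and one question.
messages = [{
    "role": "user",
    "content": [
        {"type": "image"},
        {"type": "text", "text": "Describe this image in one sentence."},
    ],
}]
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(text=prompt, images=[image], return_tensors="pt").to(device)

output_ids = model.generate(**inputs, max_new_tokens=100)
print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```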
And, Anthropic has launched a new feature for its “Claude” family of AI models, one that enables the models to cite and link back to sources when answering questions about uploaded documents. The new feature, appropriately dubbed “Citations,” is now available for developers through Anthropic’s API.
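As a rough illustration of how that looks in practice, the sketch below passes a plain-text document to the Messages API with citations enabled and prints any cited snippets that come back. The model alias and the exact response fields are assumptions based on Anthropic’s public documentation at launch, so check the current docs before relying on them.

```python
# Hedged sketch: asking Claude about an uploaded document with Citations enabled.
# Model alias and response field names are assumptions; consult Anthropic's docs
# for the authoritative request/response shapes.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

doc_text = "Meta plans to spend between $60 billion and $65 billion on capital expenditures this year."

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # assumed model alias
    max_tokens=512,
    messages=[{
        "role": "user",
        "content": [
            {
                "type": "document",
                "source": {"type": "text", "media_type": "text/plain", "data": doc_text},
                "title": "Capex note",
                "citations": {"enabled": True},  # turn on the Citations feature
            },
            {"type": "text", "text": "How much does Meta plan to spend, and where does the document say so?"},
        ],
    }],
)

# Text blocks in the reply may carry citation objects pointing back into the document.
for block in response.content:
    if block.type == "text":
        print(block.text)
        for citation in getattr(block, "citations", None) or []:
            print("  cited:", citation.cited_text)
```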
Meanwhile, can AI agents reliably click on all the images showing motorcycles or traffic lights for us? It might be too early to tell, considering that a robot would essentially have to tell a website that it is not a robot. Still, at least one of OpenAI’s Operator users was apparently able to get the AI agent to beat CAPTCHAs for him.