Cryptopolitan
2024-12-17 00:40:58

Meta makes three major updates to its Ray-Ban glasses, including live AI

Meta Platforms has added new features to its Ray-Ban glasses, including real-time live AI and Shazam, on top of the features already on board. The social media giant has been steadily upgrading the AI-enabled glasses to handle more complex tasks and respond more naturally, in major updates that are expected to transform the smart glasses.

Meta made significant upgrades to Ray-Ban

Earlier this year, Meta revealed it was integrating its next-generation AI model, Llama 3, into Meta AI, the virtual assistant in its Ray-Ban smart glasses, to enhance performance. The multimodal features entered early access last December and can perform translations in addition to identifying objects, animals, and monuments. Now, Meta has brought further major upgrades to boost the smart glasses' performance.

According to CNET, the always-on continuous AI assistance began working on the Ray-Ban glasses on Monday for owners who have access to Meta's features. This is in addition to onboard translation and Shazam, which are currently available only in the US and Canada.

The latest features added to the AI-enabled glasses include continuous audio and camera recording, as opposed to specific individual prompts. This, according to CNET, allows the glasses to be used for an extended period of time with the AI features turned on. Whenever the always-on live AI is activated, the glasses' LED also stays on. Meta keeps a recording of the conversation that can be referred to throughout the AI session.

Translation is expected to work automatically while talking. However, although the translation comes through the glasses, it arrives with a slight delay. The live AI assistance is reportedly similar to what Google demonstrated this month on its own prototype glasses via Gemini and Android XR, which arrive next year.

According to Meta, as cited by CNET, the always-on AI takes a hit on battery life, and users can expect up to 30 minutes of use before recharging the device. The company further explains that this type of always-on, camera-assisted AI is exactly what more tech companies will be exploring in the next year.

AI glasses hit their stride in 2024

The upgrades come as Big Tech pushes AI assistants as the raison d'être for smart glasses. Last week, Google revealed Android XR for new smart glasses and specifically positioned its Gemini AI assistant as the killer app. Meanwhile, Meta CTO Andrew Bosworth said in a blog post that "2024 was the year AI glasses hit their stride."

In the same blog post, Bosworth further opines that smart glasses may be the best possible form factor for a "truly AI-native device." He added that AI-powered glasses could be the first hardware category to be "completely defined by AI from the beginning."

With Meta's Ray-Ban smart glasses, users activate the virtual assistant by saying "Hey Meta," then ask a question or give a prompt, and the assistant responds through speakers built into the frames. According to Meta, users can livestream from the glasses to social media platforms like Facebook and Instagram, using "Hey Meta" to engage with the firm's "advanced conversational assistant," Meta AI. The Ray-Ban glasses feature enhanced audio and cameras, in addition to more than 150 different custom frame and lens combinations, "and they are lighter and comfortable."
