Image Credits: Jaap Arriens/NurPhoto / Getty Images

Microsoft plans to let Teams users clone their voices so that their sound-alikes can speak to others in meetings in different languages.

At Microsoft Ignite 2024 on Tuesday, the company revealed Interpreter in Teams, a tool for Microsoft Teams that delivers "real-time, speech-to-speech" interpretation capabilities. Starting in early 2025, people using Teams for meetings will be able to use Interpreter to simulate their voices in up to nine languages: English, French, German, Italian, Japanese, Korean, Portuguese, Mandarin Chinese, and Spanish.

"Imagine being able to sound just like you in a different language," Microsoft CMO Jared Spataro wrote in a blog post shared with TechCrunch. "The Interpreter agent in Teams provides real-time speech-to-speech translation during meetings, and you can opt to have it simulate your speaking voice for a more personal and engaging experience."

"Interpreter is designed to replicate the speaker's message as faithfully as possible without adding assumptions or extraneous information," a Microsoft spokesperson told TechCrunch. "Voice simulation can only be enabled when users provide consent via a notification during the meeting or by enabling 'Voice simulation consent' in settings."

A number of firms have developed tech to digitally mimic voices that sound reasonably natural. Meta recently said that it's piloting a translation tool that can automatically translate voices in Instagram Reels, while ElevenLabs offers a robust platform for multilingual speech generation.

AI translations tend to be less lexically rich than those from human interpreters, and AI translators often struggle to accurately convey colloquialisms, analogies, and cultural nuances. Yet the cost savings are attractive enough to make the trade-off worth it for some. According to Markets and Markets, the sector for natural language processing technologies, including translation technologies, could be worth $35.1 billion by 2026.

AI clones also pose security challenges, however.

Deepfakes have spread like wildfire across social media, making it harder to distinguish truth from disinformation. So far this year, deepfakes featuring President Joe Biden, Taylor Swift, and Vice President Kamala Harris have racked up millions of views and reshares. Deepfakes have also been used to target individuals, for example by impersonating loved ones. Losses linked to impersonation scams topped $1 billion last year, per the FTC.

Just this year, a team of cybercriminals reportedly staged a Teams meeting with a company's C-level staff that was so convincing that the target company wired $25 million to the criminals.

In part due to the risks (and optics), OpenAI earlier this year decided against releasing its voice cloning tech, Voice Engine.

From what's been revealed so far, Interpreter in Teams is a relatively narrow application of voice cloning. Still, that doesn't mean the tool will be safe from abuse. One can imagine a bad actor feeding Interpreter a misleading recording (for example, someone asking for bank account info) to get a translation in the language of their target.

Hopefully, we'll get a better idea of the safeguards Microsoft will put around Interpreter in the months to come.