Text-to-Video?

Plus: Generating insights from meetings, scaling laws and more

Happy Friday!

Welcome to the latest edition of AINow, your go-to source for AI insights, tips, and tutorials. We're thrilled to bring you up to date on the newest advancements in the world of AI.

As we enter 2023, AI continues to transform the world as we know it. From healthcare to finance, the impact of AI is being felt across every industry. With so many exciting developments happening, it can be hard to keep up! That's why AINow is here to bring you the most important news and insights from the world of AI.

Let's get into today's issue!

šŸ”§ Today's Top Tools

  • Create AI videos by simply typing in text

  • Ask any data question in plain English

  • Transcribe your meetings, take notes, and generate insights automatically

šŸ“Š News

ElevenLabs Pulls Open Access to Voice-Cloning Tech

AI startup ElevenLabs recently developed an extremely cool synthetic speech tool called VoiceLab, which lets you train a synthetic voice from as little as 60 seconds of audio. To promote the technology, it originally offered an open access service. Unfortunately, people misused it – "malicious content was generated by free, anonymous accounts", the company said in a tweet thread. As a consequence, it introduced a paid tier to try to reduce misuse.

"This will keep our tools accessible while allowing us to fight potential misuse," the company said. "We're tracking harmful content that gets reported to us back to the accounts it originated from and we're banning those accounts for violating our policy."

What VoiceLab is: VoiceLab is advertised as a system that can "clone voices from samples or clone your own voice… our cloning model learns any speech profile based on just a minute of audio, without training".

Why this matters: AI capabilities are increasingly powerful and available. These capabilities, like voice cloning, have a vast range of positive uses. Unfortunately, they're also edging into the sort of 'Enemy of the State'-style capabilities that drift into the murkier parts of the world, like the work of intelligence agencies. AI means capabilities which previously required exquisitely expensive and complicated black programs are now emerging into the open as a consequence of broadly available, well-understood, open research. The times, they are a-changin'.

šŸ§  Learn

Scaling Laws - why they matter and what they mean:

Epoch, an AI research organization, has published a literature review of scaling laws in AI research. Scaling laws are a strategically important area of research: they help developers figure out how to efficiently combine the right amounts of data and compute to get a predictable level of performance out of a given class of models. Scaling laws have broadly de-risked many parts of AI research by making the process of building and refining AI systems more predictable and reliable.

Whatā€™s happened in scaling laws: The literature review highlights a couple of important takeaways:

1) It's possible to come up with basic power laws that describe a lot of AI scaling, but these power laws break down at the extremes of having either very little data or a very large amount of data. There's important work to be done in modeling the transition from the less predictable region into the power-law region (a minimal sketch of the power-law form follows this list).

2) Transfer learning is still hard to understand. "There is not a simple universal scaling law for transfer learning between arbitrary tasks," they write. "When the tasks are similar enough, upstream loss and downstream performance are closely related, but when tasks are very different, the details of the architecture and hyperparameters become very relevant."
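
To make the power-law idea concrete, here's a minimal Python sketch of a Chinchilla-style scaling law of the form L(N, D) = E + A/N^alpha + B/D^beta, where N is the parameter count and D is the number of training tokens. The constants below are invented purely for illustration; they are not fitted values from Epoch's review or any particular paper.

# Illustrative sketch only: a Chinchilla-style power law
#   L(N, D) = E + A / N**alpha + B / D**beta
# where N is the number of parameters and D is the number of training tokens.
# The constants are made up for illustration, not taken from Epoch's review.

def predicted_loss(n_params: float, n_tokens: float,
                   E: float = 1.7, A: float = 400.0, B: float = 1800.0,
                   alpha: float = 0.34, beta: float = 0.28) -> float:
    """Predict training loss from model size (N) and dataset size (D)."""
    return E + A / n_params ** alpha + B / n_tokens ** beta

if __name__ == "__main__":
    # As parameters and data scale up together, predicted loss falls smoothly --
    # the "power law region" the review describes.
    for n, d in [(1e9, 2e10), (1e10, 2e11), (1e11, 2e12)]:
        print(f"{n:.0e} params, {d:.0e} tokens -> predicted loss {predicted_loss(n, d):.2f}")

A fit like this only holds in the middle of the curve; as the review points out, such power laws tend to break down at very small or very large data scales.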
