• Loop
  • Posts
  • What California’s 8 new AI laws mean for businesses and users

What California’s 8 new AI laws mean for businesses and users

Plus more on Qualcomm’s plans to buy Intel, why Microsoft wants to use nuclear power, and the FTC’s report into social media “surveillance”.

Image - Loop relaxing in space

Welcome to this edition of Loop!

To kick off your week, we’ve rounded up the most important technology and AI updates that you should know about.

HIGHLIGHTS

  • Why Qualcomm is drafting a plan to buy Intel

  • FTC’s report into the “vast surveillance of users” on social media

  • Microsoft’s new fund for AI infrastructure that could hit $100 billion

  • … and much more

Let's jump in!

Image of Loop character reading a newspaper
Image title - Top Stories

1. Qualcomm is planning to buy Intel

Intel has been struggling in recent years, which has led to Qualcomm approaching it about a potential takeover.

It’s worth noting that no official offer has been made yet, but it would be a significant move from Qualcomm - which would benefit from Intel’s x86 processor architecture and strong market position.

That would only happen if a deal were approved by regulators, which isn’t guaranteed.

Intel was once seen as the most valuable chip company in the world, but it has faced several challenges in recent years.

For some time, the company could only deliver very small performance increases. This irritated companies like Apple, which needed a stronger differentiator to convince users to upgrade to the latest Mac.

Intel also fell behind its rivals, who were moving to next-generation technologies before it could. When rivals were shipping on the 7nm process, Intel was still struggling with its own 10nm process.

Following years of stagnation from Intel, Apple cut ties with the company and announced that it would develop its own processors.

Since then, Apple has developed chips that are 11 times faster than the Intel processors in the Macs they replaced, and Mac revenues have grown by over 70%.

Intel is now in a difficult position and has been forced to aggressively cut costs. Just last year, its chip-making business lost over $7 billion.

The company has laid off 15,000 people, paused plans for new factories in Germany and Poland, and recently saw its new 18A process fail tests.

Overall, Intel still holds close to 80% market share for processors and retains a strong position. But that won’t last if its competitors continue to outpace it.

Image divider - Loop

2. All of 23andMe’s board directors have quit

The mass exodus is due to a lack of confidence in the current CEO, who failed to take the company private in July.

The genomics company has also faced a difficult few years, with sales of its DNA kits falling as public interest in DNA testing wanes.

A huge data breach in December 2023, which impacted 7 million people, certainly didn’t help matters. The company has recently agreed to pay the victims $30 million in compensation.

23andMe was once valued at $6 billion, but that has since fallen to just $170 million - a 97% crash from its historic high.

It has never made a profit and is burning through cash so quickly that it could run out in 2025.

Image divider - Loop

3. Snapchat releases its latest AR glasses

The new generation of Spectacles has an improved display, better performance, and longer battery life. They still look weird, though, and aren’t available to regular consumers.

Instead, the device is only being sent to software developers. It’s a strange decision: what’s the point of creating AR apps for a device with no users?

Snapchat’s approach contrasts with Apple’s and Meta’s, which have instead opted to release their devices to general consumers.

The company has made solid improvements to hand tracking and voice control, but I can’t get away from the fact that the glasses look weird.

The frame is far too thick and it would really stand out if you tried to wear these AR glasses in public.

It’s great to see Snapchat trying to move the sector forward while minimising its manufacturing costs, but I’m not sure this is the right strategy for medium-term adoption.

Image divider - Loop

4. BlackRock and Microsoft are planning a $30 billion megafund for AI

The Global AI Infrastructure Investment Partnership (GAIIP) aims to raise $30 billion initially, with the ultimate goal of reaching $100 billion.

The money will be used to invest in new data centres and expand existing sites to meet the growing demand for computing power.

There are also plans to invest in energy infrastructure, which is quickly becoming a problem for these new data centres.

It’s also believed that Microsoft is in talks with OpenAI to construct a massive supercomputer, codenamed Stargate, which could cost up to $100 billion.

Image divider - Loop

5. Microsoft signs a deal to use nuclear power

Following on from Microsoft’s investment plans, the company has also signed a huge deal with Constellation Energy.

For the next 20 years, Microsoft will use nuclear power from the Three Mile Island plant in Pennsylvania. This will be used to power the company’s data centres, which require more and more energy.

That plant was shut down in 2019, but Constellation plans to spend $1.6 billion and make it operational again in 2028.

This is an unprecedented move. It reveals just how much energy will be needed to create future AI models and to provide them to businesses around the world.

But it also shows the growing reach of tech companies, as they become the only players who can truly create and deploy next-generation technologies.

For years we said that data was the new oil. Now we need to ask ourselves if these companies are becoming the new Standard Oil.



Image title - Closer Look

California has approved 8 new laws for AI. But what are they?

California state building

California has signed eight AI bills into law, making them some of the most far-reaching AI rules in the United States.

Two of those bills have made it a criminal offence to create and spread explicit deepfakes of other people.

Social media platforms are also required to create a way for users to report these deepfakes and instantly remove them.

It’s a good move, and one that is perfectly achievable for the big platforms. For example, YouTube has seen great success with its Content ID system, which works in a similar way.

Those laws will also protect children and young adults, who are most vulnerable to these types of deepfakes.

For AI-generated images, watermarks are now required. These allow both the public and other companies to spot when something has been artificially created.

While there are issues with the watermarking technologies that are available, they are the only tool we have. That said, Google’s SynthID seems to be one of the best out there.

It’s already very difficult to identify AI images and that will only get more difficult as time goes on. This is a step in the right direction, but not the complete answer.

Three other laws focus on deepfakes that could influence US elections. Online platforms will need to label content as a deepfake or remove it, clearly show when a political ad has used AI, and stop users from posting deceptive material.

The two remaining bills are centred around California’s huge creative industry. Movie studios will need to have an actor’s permission before they can create an AI replica of them - including their voice or likeness.

This is another good move that protects actors from being exploited by movie executives, with many of those actors struggling to get by.

California has taken a common-sense approach to protecting people from AI’s misuse, without regulating the technology itself and stifling innovation.

The state is outlining a blueprint that, hopefully, other governments will follow.



Image title - Announcement

FTC investigates “vast surveillance of users” on social media

FTC building

The FTC examined how data is gathered by social media companies - such as Facebook, WhatsApp, YouTube, Reddit, Amazon, and TikTok - though its report remains vague about each company’s exact methods, as revealing them could hand a competitive advantage to rivals.

The agency’s report found that there was “vast surveillance of users” and that businesses had little incentive to self-regulate or protect user data.

This is information we’ve heard before, so it’s no shock to anyone. Social media companies collect data about every interaction you make, or don’t make, and how long it took.

They also try to determine your current mood. For example, if you’ve looked at an ad for over 5 seconds, you’re probably in the mood to buy something.

Next time you do this on Instagram, notice how quickly it shows ads on your screen.

What’s important from this report is how it hints at future regulations. The FTC recommends that companies “strengthen protection for teens”.

Just two days before the report was published, Meta announced that it was adding more protections for teenage users.

The FTC is starting to take a tougher line on social media platforms and this investigation is just the first step.

We should expect to see more regulations around data protection, allowing users to opt-out of their data being used to train AI models, and restrictions on what data can be collected about under-18s.



Image title - Byte Sized Extras

🎥 YouTube Studio allows creators to brainstorm video ideas with AI

🌖 Intuitive Machines lands $4.8 billion NASA contract to create Earth-moon communications

🔥 Google’s new sensors aim to detect wildfires much faster

🚗 Ford, BMW, and Honda's vehicle-to-grid company is live

🎬 Amazon releases a video generator for ads

🥽 Anduril is adding their software to headsets used by the US military

🕵️ US government secretly took control of a botnet that was run by Chinese government hackers

🤖 LinkedIn is using your data to train their AI models

🌲 Black Forest Labs is raising $100 million, at a $1 billion valuation

🎞️ GenAI startup Runway signs a deal with a major Hollywood studio

Image of Loop character with a cardboard box
Image title - Startup Spotlight
Cybersecurity dashboard

Picus Security

This startup was founded by three Turkish mathematicians and focuses on the cybersecurity industry.

The company has successfully simulated over 1 billion cyber attacks, which led to over $45 million being raised in funding.

Picus Security’s platform aims to prevent security breaches, with online attacks becoming more sophisticated and difficult for organisations to trace.

Their technology works by running continuous validation and cyber-attack simulations. These allow companies to fix issues within the network more quickly and show how a customer’s codebase is vulnerable to attack.

So far, they’ve been able to raise a total of $80 million and have over 500 customers - including Mastercard, Visa, and ING.

To grow the company and take advantage of the US’ huge cybersecurity workforce, they have since relocated to San Francisco.

The company then tripled revenues in 2020, with headcount reaching 200 people that same year.

I’ve added a link below if you want to learn more about them.



This Week’s Art

Cafe view in Edinburgh

Loop via Midjourney V6.1



Image title - End note

We’ve covered quite a bit this week, including:

  • Qualcomm’s initial plans to buy Intel

  • Why all of 23andMe’s board directors have quit

  • Snapchat’s latest AR glasses, which are only available for developers

  • BlackRock and Microsoft’s $30 billion fund for AI

  • Why a nuclear power plant is being brought online again

  • A closer look at California’s eight new laws for AI and their potential impact

  • FTC’s report into the “vast surveillance of users” on social media

  • And how Picus Security has simulated over 1 billion cyber attacks for customers

Have a good week!

Liam

Image of Loop character waving goodbye

Share with Others

If you found something interesting in this week’s edition, feel free to share this newsletter with your colleagues.

About the Author

Liam McCormick is a Senior AI Engineer and works within Kainos' Innovation team. He identifies business value in emerging technologies, implements them, and then shares these insights with others.