China tech: those who control the algorithms control the future | Financial Times
The algorithms, the “secret sauce” of many tech businesses, are just one target of China’s latest crackdown on business. Regulators want full disclosure of how tech groups crunch big data and users’ viewing histories to show them products or content.
The move is more nebulous than the recent cyber security reviews for local tech groups hoping to list overseas. Western news reports focused on the dwindling listing options of companies such as TikTok-owner ByteDance and autonomous driving start-up Pony.ai. But potential state interference in algorithms poses a bigger threat to business models.
I usually refrain from commenting on China, for two reasons. First, much of its technology has been stolen and repurposed to serve the party’s interests. Second, it has become an international headache due to its expansionist policies and extensive human rights abuses. That’s why I was surprised it is willing to kneecap its own homegrown “start-ups” and curtail their algorithms.
It is not my concern that their consumer companies face internal regulation; it is my concern that they are showing the entire world a playbook for how it’s done. One speculation is that this is meant as an arm-twisting template for American audiences, especially as the clamour to control big tech has grown louder within the corridors of power. The alternative explanation is that it forces the companies to demonstrate their subservience.
If algorithms are the “secret sauce”, the regulator can seize them, subject them to code review, and look for instances of “discriminatory behaviour”. This can be a slippery slope for many companies and healthcare delivery organisations, because in the current state it is uncharted territory. Those AI implementations have not been extensively tested in the courts, nor have they been validated as “free of biases”. There will always be confusion around them among the lay public.
I hope this serves as a wake-up call for countries and institutions to include a broad representation of users, as well as policy makers, in understanding how algorithms are designed and whether they conform to any established standards (if any exist).