Two of the biggest forces in two deeply intertwined technology ecosystems — large incumbent companies and startups — took a break from counting their money to jointly plead that the government cease and desist from even contemplating regulations that might affect their financial interests, or, as they like to call it, innovation.
“Our two companies may not agree on everything, but this is not about our differences,” writes this group of wildly disparate perspectives and interests: A16Z co-founders Marc Andreessen and Ben Horowitz, Microsoft CEO Satya Nadella, and President/Chief Legal Officer Brad Smith. A truly intersectional group, representing both big companies and big money.
But it’s the little guys they claim to be looking out for — that is, all the companies that would have been affected by the latest attempt at regulatory overreach: SB 1047.
Imagine being punished for openness! A16Z general partner Anjney Midha called it a “regressive tax” on startups and “blatant regulatory capture” by the big tech companies that, unlike Midha and his poor colleagues, could afford the lawyers needed to comply.
Except that this was all disinformation spread by Andreessen Horowitz and the other moneyed interests that might actually have been affected as backers of multibillion-dollar model developers. In fact, small models and startups would have been only trivially affected, because the proposed law specifically protected them.
Oddly enough, the very kind of purposeful carve-out for “little tech” that Horowitz and Andreessen routinely champion was distorted and belittled by the lobbying campaign they and others waged against SB 1047. (The architect of that bill, California State Senator Scott Wiener, talked about this whole thing recently at Disrupt.)
The bill had its problems, but its opposition greatly overstated the costs of compliance and never substantiated claims that it would deter or significantly burden startups.
It’s part of the well-established playbook by which big tech — with which, despite their posturing, Andreessen and Horowitz are closely allied — operates at the state level, where it can win (as with SB 1047), while calling for federal solutions it knows will never come, or that will have no teeth thanks to partisan bickering and congressional incompetence on technical matters.
This joint statement on “policy opportunity” is the latter part of the play: having torpedoed SB 1047, they can say they did so only with an eye toward supporting federal policy. Never mind that we are still waiting on a federal privacy law — one tech companies have pushed for a decade while fighting state bills.
What policies do they support? “A variety of responsible, market-based approaches,” or in other words: Hands off our money, Uncle Sam.
Regulation should involve a “science and standards-based approach that recognizes regulatory frameworks that focus on the application and misuse of technology,” and should “focus on the risk of bad actors misusing AI.” What this means is that there should be no proactive regulation, only reactive penalties when criminals use unregulated products for criminal purposes. This approach worked out great for the whole FTX situation, so I can see why they espouse it.
“Regulation should only be implemented if its benefits outweigh its costs.” It would take thousands of words to unpack all the ways this idea, expressed in this context, is hilarious. But essentially, what they are proposing is to put the fox on the henhouse planning committee.
Regulators should “allow developers and startups the flexibility to choose which AI models to use wherever they are building solutions and not tilt the playing field to advantage any one platform.” The implication being that there is some kind of plan to require permission to use one model or another. Since that isn’t the case, this is a straw man.
Here’s the big one, which I have to quote in full:
The right to learn: Copyright law is designed to promote the progress of science and useful arts by extending protections to publishers and authors to encourage them to bring new works and knowledge to the public, but not at the expense of the public’s right to learn from these works. Copyright law should not be co-opted to imply that machines should be prevented from using data — the foundation of AI — to learn in the same way as people. Knowledge and unprotected facts, regardless of whether contained in protected subject matter, should remain free and accessible.
To be clear, the explicit assertion here is that software, operated by multibillion-dollar companies, has a “right” to access any data because it must be able to learn from it “in the same way as people.”
First off, no. These systems are not like people; they produce data that mimics the human output in their training data. They are complex statistical projection programs with a natural-language interface. They have no more a “right” to any document or fact than Excel does.
Second, the idea that “facts” — by which they really mean “intellectual property” — are all these models are after, and that some kind of fact-hoarding cabal is working to thwart them, is a manufactured narrative we’ve seen before. Perplexity invoked the “facts belong to everyone” argument in its public response to being sued for allegedly plagiarizing content, and its CEO Aravind Srinivas repeated the fallacy to me onstage at Disrupt, as if the company were being sued for knowing trivia like the distance from the Earth to the moon.
While this is not the place for a full accounting of this straw man argument, let me simply point out that while facts may indeed be free agents, the way they are created — say, through original reporting and scientific research — entails real costs. That is why the copyright and patent systems exist: not to prevent the wide sharing and use of intellectual property, but to incentivize its creation by ensuring that real value can be assigned to it.
Copyright law is far from perfect, and is probably abused as much as it is used. But it is not being “co-opted to imply that machines should be prevented from using data” — it is being applied to ensure that bad actors do not circumvent the systems of value we have built around intellectual property.
And that is quite plainly the ask here: that the systems they own, operate, and profit from be allowed to freely use the valuable output of others without compensation. To be fair, that part is “like people,” because it is people who design, direct, and deploy these systems, and those people don’t want to pay for anything they don’t have to — and they don’t want regulations to change that.
There are plenty of other recommendations in this little policy document, which were no doubt laid out in more detail in the versions sent directly to lawmakers and regulators through formal lobbying channels.
Some of the ideas are good, if a bit self-serving: “Fund digital literacy programs that help people understand how to use AI tools to create and access information.” Good! Of course, the authors are heavily invested in those tools. Support “open data commons — pools of accessible data that would be managed in the public’s interest.” Great! “Examine procurement practices to enable more startups to sell technology to the government.” Awesome!
But these more general, positive recommendations are the kind of thing we see from the industry every year: invest in public resources and speed up government processes. These palatable but inconsequential suggestions are merely a vehicle for the more important ones I described above.
Ben Horowitz, Brad Smith, Marc Andreessen, and Satya Nadella want the government to back off from regulating this lucrative new industry, to let industry decide which regulations are worth the trade-off, and to abandon copyright in a way that acts more or less as a general amnesty for the illegal or unethical practices that many suspect enabled the rapid rise of AI. Those are the policies that matter to them, whether or not the kids get digital literacy.