Large language models like ChatGPT are only as good as the data on which they are trained, and the gatekeepers of that data are increasingly looking to capitalise on new-found demand fuelled by generative artificial intelligence (AI). A recent revolt by moderators on the social forum Reddit illustrates this trend. Reddit’s plan to charge for access to its application programming interface (API), a virtual firehose that serves up all content posted to Reddit in a developer-friendly format, has sparked acrimony among the Reddit community.
Tension is brewing between the steady march of AI and the longstanding spirit of the open internet, where access to information is presumed to be free and control is decentralised. To help discern the fault lines in this hastening turf war, Double Take welcomed AI ethicist and startup founder Kelvin Lwin, who was formerly a senior deep learning instructor at chip giant Nvidia. According to Lwin, open APIs have allowed platforms to spawn and scale innovations at a faster clip than they ever could have done on their own, but the advent of paywalled AI models changes that proposition.
Every platform, I think, wanted to become an ecosystem. So you want to encourage developers, right?…It’s the right intuition, but then that opens yourself (up) to these kinds of seminal moments now where it comes to a head—where developers have different incentives, the platform owners have different incentives, but everybody’s trying to make money. So then, who gets to eat?
Kelvin Lwin, founder and chief executive officer of ALIN.ai
Lwin explains that an awkward middle ground has developed, where open-source data on platforms such as Reddit is considered ‘free to use’ yet at the same time ‘not for commercial use’.
I think, generally, it’s a very inherent problem in our tech field because all of us want to help the world and contribute to the world, so open source sounds good on paper. But then, when you actually have these monetisation and business model pressures, what we usually have seen is that open-source projects themselves, if they’re not supported by some profit-making entity, they tend to die off.
Kelvin Lwin
From Lwin’s perspective, the secret sauce of a large language model is a blend of the source data that feeds the algorithm and the method by which that data is wrangled through the model to produce an output.
Who gets to determine which recipe(s) prevail, and whom, if anyone, might the recipe(s) enrich? Lwin suggests that these questions may be key to the development of AI.
In general, decentralisation was the ethos of technology in Silicon Valley in general, right? That’s why you get the Bitcoin and blockchain and all those decentralisation kinds of things. And then there was a joke that, how come for all the talks about decentralisation, everybody was happy with over 100 million users going and relying on one company’s API, ChatGPT, overnight?…I would argue AI is actually a trend toward more centralisation because you have all this data, sensory capacity, compute (capacity) and pattern recognition capacity. So, can you make centralised decision-making? Probably.…There’s an underlying bigger arms race going on with what is the right model to get it out to the people.
Kelvin Lwin
According to Lwin, even platforms such as Reddit, which are decentralised in terms of their users, content and moderation, are experiencing a tilt toward centralisation of control.
It has a decentralised model in terms of the users, the content, the moderators, right?…But then from the company point of view, especially when they’re trying to monetise it or really prop up the value, then it becomes centralised, right? The CEO is making the decisions on the APIs…Then the users are going, ‘I thought we own this platform?’…‘I thought this is a place for us?’ Because they don’t have to care about making money and (being) profitable and things like that, but the CEO and the centralised, the corporate entity do. So I see it as, again, the fight between the two modes of thinking.
Kelvin Lwin
Lwin anticipates that the ramifications of this showdown will be profound and pervasive.
I think we are at a critical juncture of humanity in some ways. What it means to be human, what can humans do, what can technology or AI or robots do? All this sort of stuff that was in the realm of philosophy and religion, kind of entering the public discourse…Remember, AI only learns from the example we give it.
Kelvin Lwin
Subscribe to ‘Double Take’ on your podcast app of choice or view the Reddit and the AI turf war episode page to listen in your browser.
This is a financial promotion. These opinions should not be construed as investment or other advice and are subject to change. This material is for information purposes only. This material is for professional investors only. Any reference to a specific security, country or sector should not be construed as a recommendation to buy or sell investments in those securities, countries or sectors. Please note that holdings and positioning are subject to change without notice. This article was written by members of the NIMNA investment team. ‘Newton’ and/or ‘Newton Investment Management’ is a corporate brand which refers to the following group of affiliated companies: Newton Investment Management Limited (NIM), Newton Investment Management North America LLC (NIMNA) and Newton Investment Management Japan Limited (NIMJ). NIMNA was established in 2021 and NIMJ was established in March 2023.
This material is for Australian wholesale clients only and is not intended for distribution to, nor should it be relied upon by, retail clients. This information has not been prepared to take into account the investment objectives, financial objectives or particular needs of any particular person. Before making an investment decision you should carefully consider, with or without the assistance of a financial adviser, whether such an investment strategy is appropriate in light of your particular investment needs, objectives and financial circumstances.
Newton Investment Management Limited is exempt from the requirement to hold an Australian financial services licence in respect of the financial services it provides to wholesale clients in Australia and is authorised and regulated by the Financial Conduct Authority of the UK under UK laws, which differ from Australian laws.
Newton Investment Management Limited (Newton) is authorised and regulated in the UK by the Financial Conduct Authority (FCA), 12 Endeavour Square, London, E20 1JN. Newton is providing financial services to wholesale clients in Australia in reliance on ASIC Corporations (Repeal and Transitional) Instrument 2016/396, a copy of which is on the website of the Australian Securities and Investments Commission, www.asic.gov.au. The instrument exempts entities that are authorised and regulated in the UK by the FCA, such as Newton, from the need to hold an Australian financial services license under the Corporations Act 2001 for certain financial services provided to Australian wholesale clients on certain conditions. Financial services provided by Newton are regulated by the FCA under the laws and regulatory requirements of the United Kingdom, which are different to the laws applying in Australia.