My take on AI
January 14, 2025
1347 words
Last year, in 2024, I interacted more with LLMs than I did with humans. I tried tens of LLMs and over a hundred AI tools. They made my learning more efficient and my work faster. AI has improved enormously in the last 1000 days, roughly the same amount of time that has passed since I joined university. From coding to more creative tasks like writing or image generation, it has improved at an exponential rate, from DALL-E to the latest Flux models that no longer mangle text in images (which was a big issue initially). ElevenLabs has played a huge part in AI voices. With just an instruction, I can make Jermaine Cole drop a fresh verse like Port Antonio (love that song), get a very realistic image of a gorilla doing deadlifts, or even create a 4K video of a UFO landing on top of a building.
All this raises a question: is creativity dead? Are humans useless now? The internet pretty much believes so. Creativity, as people perceive it, is merely creating something. But I believe creativity is more emotional than physical. Writing this gives me joy and a new perspective. A kid piling up stones and calling it a building is neither a tedious task nor a particularly creative one, but the imagination he has while building it and the emotions he feels while doing it are. That pile won't stay there for long, but the feeling of making it can last a lifetime. A much better output can be produced by just giving an instruction to an LLM, but how much is that output worth to you? Will it ever give you the joy of creating it on your own? People who have a sense of craft crave human imperfections. The richest man always flaunts his handmade clothing. A power loom can produce a better result, even stitches, perfect symmetry, and finish the job faster, but he still pays 10x more and waits 10x longer for human-made clothing, imperfections and all.
Throughout the internet nowadays, I see an abundance of AI-generated content, and it's very easy for any observer to identify. I read LinkedIn posts from my peers showcasing their projects or sharing something else. Every single post is AI-generated, complete with emojis like 🚀 and ✨, the standard ones ChatGPT sprinkles in. I have seen people copy an entire article and paste it into ChatGPT to summarize it. AI is biased by nature because it's trained on public data. The person who wrote that article has his own bias, and by asking AI to summarize it, you lose the bias the author had and get a different perspective that is pretty much useless. I do this too for technical documentation, but never for anything that carries an opinion.
By this point, you might have concluded that I'm against AI, a boomer. No; I have used AI extensively and learned a lot about how it works, why it hallucinates, and how to improve its efficiency. A senior member of our team is an applied AI engineer, and discussions with him taught me a lot about the limitations of LLMs and why, after a point, progress saturates. Some of AI's early problems were later fixed; one good example I remember is GPT-3.5's poor Python code generation in its early days. An LLM, at its core, is just a next-word predictor, so while generating Python code, it had trouble with proper indentation and with producing logically sequenced statements. This was fixed with rigorous reinforcement learning and human feedback applied to the LLM, using Python's abstract syntax tree (AST), which improved the accuracy of Python code generation significantly. In my job, I have used AI a lot, and it probably writes 80% of my code. Does that mean I am useless because the majority of the work was done by an LLM? Again, no. AI shines at solving LeetCode problems and at the majority of common dev work because it's trained on human-written solutions to those LeetCode problems and on GitHub repositories populated by humans. You could call it a great search engine, but not the best one: it can't conjure logic that doesn't exist yet or instantly absorb the latest versions of frameworks. Anyone who has ever created a product end to end knows that AI can be a helper at best, not the creator. You still have to read the docs, look at the code, understand requirements, fix bugs, gather feedback, and deal with 100 more things.
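Incidentally, the indentation failures I mentioned are easy to catch mechanically: Python's own `ast` module refuses to parse syntactically broken code. As a minimal sketch (the helper `is_valid_python` is my own name, not a library function), you could gate any generated snippet behind a syntax check like this:

```python
import ast


def is_valid_python(code: str) -> bool:
    """Return True if the string parses as syntactically valid Python."""
    try:
        ast.parse(code)
        return True
    except SyntaxError:  # IndentationError is a subclass of SyntaxError
        return False


good = "for i in range(3):\n    print(i)"
bad = "for i in range(3):\nprint(i)"  # the classic missing-indent failure

print(is_valid_python(good))  # True
print(is_valid_python(bad))   # False
```

This only proves the code parses, of course; it says nothing about whether the logic is right, which is exactly why a human still has to read it.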
Content creators, as usual, want to stay relevant, and for that, they jump on the trending bandwagon even when they don't understand it. I see many non-tech creators make reels and videos about how AI can develop a website in 5 minutes and then proceed to show us how to create a to-do app. The irony. When Devin (a software engineering agent) was released, tens of thousands of videos and posts claimed that software engineering was dead. Even the biggest tech creators rode the same boat without doing their own research. A month later, a YouTube channel called "Internet of Bugs", hosted by a senior software engineer (great channel btw), published a deep dive into Devin, showing that its demo was faked and that it can't handle anything even mildly complex. After that video, thousands of the same creators started posting that Devin was a lie and that software engineers can't be replaced. This cycle has been going on for two years now. Last week, the CEO of Salesforce stated that they will not hire any junior software engineers in 2025 because AI agents can do the same work. I highly doubt they will stay true to this commitment; similar declarations have been made before, and in the end, those companies hired more than they used to (or fired more). This again sparked the conversation: is software engineering dead?
In early 2024, I started seriously deep-diving into web development. Everybody said no and mocked me: web development is dead, AI has already taken it. By the way, even GPT-4o hadn't been released by then. Web development, for the people who actually know what it is, can never be dead. It is the biggest medium of product creation and distribution: your product is accessible on every device in the world that has a browser, from a smartwatch to an 85-inch TV, whereas, say, an Android app requires specific OS versions and a lot of dependencies on the user's side. I kept learning the web, I leveraged AI, I built projects, and I built clones by following tutorials. This so-called 'dead' field has kept me employed for the past six months, has let me earn lakhs from my job and freelancing, and I still actively learn in the same domain.
Efficient use of AI has definitely made us more productive. That means there will be a reduction in jobs, but, in my view, not more than 50%, and definitely not elimination. I know overpopulation has hindered us, and I know there is prejudice against students from private universities, but these can't be reasons to complain. I strongly believe that the people scared of an AI takeover are those who know, somewhere deep down, that they never honed their skills to a level where they can stand on their own feet and talk about their craft, which is a key part of creating something or staying employable in the industry. The world, especially India, needs strong AI education about how AI can be leveraged and where its limits are, rather than about how it will replace everyone in some future that I can't see right now.
This writing will have grammatical mistakes, many parts could have been written better, and the whole thing is biased. But that's the whole point: it's written by a Human!