Thoughts on AI, Intelligence, and Knowledge

# Philosophy # Society # AI # Self-Improvement

I have written extensively about AI in the past. On my tech blog you can find posts like Vibe management, The commoditization of AI, Living side-by-side with an AI, and ChatGPT, AI, and the future of tech; on this blog, I recently posted an article titled In the name of progress. Most of these articles are somewhat negative towards AI.

Now, I’m not gonna lie and tell you that I never use AI and that all of it is bad. I do use it: instead of Google search, to do product comparisons, to plan trips and brainstorm ideas, to check facts (which I sometimes cross-reference with other sources). But I never used AI for coding. Sure, I used AI to ask for help like “how do I sort an array in Rust” or “how do I split a string in Ruby”, and sometimes I even requested pieces of code. You know, the usual stuff, the same stuff one would Google before AI, except now it lands you on a StackOverflow question with 52 people arguing about the best over-engineered way to sort an array in JavaScript.

But I never let AI write code for me. I was in the camp of “if you don’t do it yourself, you will lose this ability”. And recently, my friend told me that all is lost, because I confessed to him that I had started letting AI write code for me, while I take the wheel: reviewing the code and steering the AI towards accepted conventions and optimal solutions. And… I like it. But I still have this internal debate between the angel and the devil, each whispering a different idea into my ear. And as always, when I feel lost or confused, I write about it, because writing is magical.

What does it mean to do something from scratch?

One common argument that I see online, and have expressed myself, against AI — is that you no longer create things from scratch. If you let AI write the code, are you really a programmer then? This question has been asked throughout history in different variations. It was used as an argument against high-level languages — are you really a programmer if you don’t know how a sorting algorithm works, but instead call whatever_array.sort()? Are you really a programmer if you copy a string by assignment, a = b, instead of manually allocating the needed memory and terminating the string with \0? Do you really know what it means to program a computer if you have never XORed a CPU register?

I saw a video posted on HackerNews recently, where a guy created a floppy disk from scratch. One of the comments went along the lines of “cool project, but is it really from scratch?”, implying that the author did not obtain the needed chemicals himself, mine the necessary metal himself, or create the plastic needed to print the housing. This argument can be applied to many aspects of life: did you really build your own house if you did not bake your own bricks and cut your own lumber? Is a home-made dress really home-made unless you planted and harvested the cotton, and produced the sewing tools yourself? Where does one draw the line on what it means to do something from scratch?

What does it mean to be intelligent?

Another common argument in favor of, or against, AI is the argument of intelligence. When I do something and fail, I resort to reasoning about and understanding my actions in an attempt to improve my next iteration. Contrary to what most people believe, we learn not by reading books, but by doing things, failing, and iterating. Reading books without doing is acquiring knowledge. Applying said knowledge is the definition of intelligence. And one’s intelligence is built on the intelligence of others. A cook who makes a tasty dish does not necessarily know how to grow or harvest the needed ingredients for said dish, yet he is still intelligent. Therefore, using AI to produce working code does not disqualify me from being an intelligent being.

Or maybe it does? Where do we draw the line? Going back to the chef example: while he does not grow and harvest the ingredients, he most likely knows their tastes and, through trial and error, has found a combination of ingredients that produces a good-tasting dish. He could have asked ChatGPT to create a “tasty dish”. The result would, most likely, be bad. And so he would iterate: “I want it more sour”. And so on. Does this count as iteration in the sense of an intelligent being? Thomas Edison iterated a thousand times before creating a working light bulb, and with each iteration he came closer. What if, instead, he had prompted an LLM through a thousand iterations to create a light bulb? Would that qualify him as an intelligent inventor of the light bulb?

Moreover, people tend to over-romanticize the meaning of the word “creation”. If you go to a bunch of restaurants, you will eventually notice that most dishes are the same. Sure, some restaurants might bring their own touch (sometimes bad), but in general, a hamburger is more or less the same in most places. A similar thing can be said about cars, for example. Most cars look the same. The VW Transporter, Ford Transit, Mercedes-Benz Vito, and Peugeot Expert all look more or less alike. When people claim to be creating software, they tend to overlook the fact that a good chunk of the code in there was not written by them, but introduced as third-party dependencies, or as algorithms from Wikipedia, blogs, or StackOverflow. And then we go deeper and deeper down the rabbit hole, until eventually we ask: “did you really create something if you had to rely on the work of hundreds of thousands of people throughout thousands of years of progress?”

Does this mean that all these examples lack intelligence? Just because you take a successful burger recipe, or a car platform or chassis, and replicate it, does that mean you did not create anything new?

The lost knowledge

There is a very interesting talk from 6 years ago titled Preventing the Collapse of Civilization, in which the author brings examples of technologies from the Byzantine Empire and the Bronze Age which are now lost. Lost, because nobody documented how they were built, and the people who built them did not pass that knowledge down. The talk then discusses how something similar might happen in tech. A common example people like to bring up is the COBOL language. COBOL is still used heavily in the banking system, yet the number of COBOL developers is declining. What will happen when there are no more COBOL developers to maintain the existing systems written in COBOL? That will become lost knowledge.

The same argument applies to AI. As we move higher and higher in abstraction levels, we lose the ability to understand the lower levels, and eventually knowledge will fade from existence, because people will have no incentive to acquire it. Why would one learn the various sorting algorithms, and how they work, if LLMs can offer the best solution for a particular use case? How many developers truly know which sorting algorithm is used when they call .sort() in their preferred programming language?

Too far-fetched for you? There is a documented decline in the number of people, especially among Gen-Z, who can write by hand (Oxford Learning, Wired). As we move further towards digital life, the need to write by hand declines. It is possible that in the near future handwriting will become lost knowledge.

But does it even matter?

When I was in my mid-twenties, my friend and I used to discuss the following question: if I teleport you back a thousand, or two thousand, years, what would you be able to teach the people of that time from our modern knowledge? It’s an interesting thought experiment for understanding what knowledge is. You could say: ha, it’s easy, I just print the entire Wikipedia, go back in time, and teach them everything we know. But how would you talk to them? I bring a dictionary. But is this your knowledge then, or are you just a vessel? By that logic, I could just teleport a dictionary and a printed version of Wikipedia, and let the ancient people figure it out themselves.

Any knowledge we possess always stands on the knowledge of someone else. I know how to write a program in Rust or JavaScript, yet I don’t know how to build a CPU that will execute this program. A person who knows how to build a CPU might not know how to obtain the needed materials for it, and so on.

My negative view of LLMs was motivated by a sense of moral superiority: “well, at least I know how to actually write software”. Which is in part true, but in part self-deception. I simply defined what it means to know how to write software, while taking for granted everything outside my definition, such as how to build a CPU, or how it works. Just like the guy with the floppy disk, who defined “from scratch” as taking material and a 3D printer, while for some people “from scratch” would mean taking (or building?) a pickaxe and mining your own minerals.

But in the end, does it really matter? There is a theory I’ve heard that everything in life is cyclic. The reason we engage in wars, despite the documented horrors of war; the reason we fall for the same old scams, adapted to modern times; and the reason old styles and technologies keep coming back — is that everything is cyclic. We are not really capable of learning from past experiences; instead, each generation has to take the same path and make the same mistakes. We oscillate from liberalism to oppression, from freedom to the lack thereof — because despite the fact that we have knowledge of how bad (or good) things have been, we do not have the experience of it. And then there are people caught in-between transitions. People who have experienced the bad, the good, and are on a path to experience the bad once again, not understanding how it is even possible that those damn young people ignore it, because back in my days…

In the end, it doesn’t really matter. Every opinion we have, for or against any technology, is just that: an opinion. People take a stance according to their moral views and beliefs about the world. Myself included. I guess I could say that I believe in meritocracy, despite knowing that society is not built on merit. But that does not prevent me from believing that a good software engineer is someone who writes code by hand and understands the meaning behind every statement. It’s just my world view, with boundaries that I’ve created to justify it. Boundaries that say it’s not important to understand how the hardware works, or the chemistry behind it.

The only constant in life is change. Everything is bound to change. Even documented history can be shaped into whatever narrative you want. Knowledge, especially in the digital age, rarely gets lost. Handwriting techniques are still there; every Gen-Z kid can go and learn them, they just don’t need to. The same way we no longer write on papyrus, or use wax seals to signify the authenticity of a document, because we found easier ways to do it.

I guess what I am trying to say is that gate-keeping a certain aspect of our society has no benefits. The reason Gen-Z don’t write by hand is that they have no need for it. It’s aristocracy all over again. People who have a comfortable life tend to sit and discuss morals and ethics, while the common folk try to survive day by day. The difference is that in the old times, aristocracy was mostly passed from generation to generation. Children of wealthy people learned to read and write because their parents knew how to read and write, thus becoming aristocrats. Nowadays, knowledge is mostly available to everyone. Sure, using AI might dumb down your critical thinking skills, but nothing prevents you from writing code by hand in your spare time to keep your mind sharp. I still read books, despite the fact that the general trend seems to indicate a decline in book reading.

And I can go on and on with such examples. We lost the art of fixing our electronics or cars by outsourcing the labor to someone else, be it a car mechanic or a factory in China, and this made us less capable. We lost the art of growing and harvesting our food, because we have farmers and their machinery to take care of that, and this made us less capable. We have surrounded ourselves with warmth and comfort, thus prolonging our lives, and this made us less capable of withstanding a harsh environment.

And I guess the topic of “X is making us dumber / replaces thinking / etc.” has been discussed since the first day we gained the ability to philosophize and think. The only way to prevent X from making us dumber is to go back to hunter-gatherer tribes whose sole goals in life are surviving yet another day and reproducing.