In the previous post I had a look at how I can integrate an LLM into my programming environment and use a prompt to produce and modify code, to find bugs and security issues, and to discuss design options and fixes. I find the result stunning. So is AI-assisted coding a good or a bad thing? Maybe this is the wrong question to ask; it’s like wondering whether programming in Python is a good or a bad thing compared to programming in assembly language. Let’s dwell on this a bit.
Layers of Abstraction
Let me start with the following assumption: AI-assisted coding is just the latest abstraction layer on an already high heap of abstraction layers, one that allows us to further increase complexity, functionality, and development speed, or to reduce the number of people working on code. One can have a combination of these, but from what I can tell, putting another abstraction layer on top has rarely, if ever, led to fewer jobs in the computing industry. But let’s have a look at the abstractions we have accumulated over the past 70 years of computing. If you work in the industry today, and not only in programming, you should at least have a basic idea of all of the following layers of abstraction. The further up the chain you go, the more you should know about each abstraction layer in order to understand what is going on and to make solid decisions in this industry:
- Understanding the concept of electricity
- Understanding of transistors
- Understanding of logic gates
- How logic gates can be combined for arithmetic operations, memory, etc.
- How a CPU works
- What machine language is, and why an assembler was put on top of it.
- How languages that are close to the hardware, like C, work. This implies an understanding of stacks, heaps, buffers, and pointers, and of how and why buffer overflows can ruin your day.
- Understanding of higher-layer languages that abstract things like stacks and heaps away and put data into abstract data structures that are no longer prone to overflows and inadvertent code execution (see the first sketch after this list).
- Understanding of operating system principles and the services they provide to programs. This includes storage, files, pipes, multitasking, threading, preemption, memory management, and so on.
- Understanding of graphical user interfaces
- Understanding of algorithms for storing, searching, and retrieving data effectively.
- Understanding of database principles: how databases work, how data is stored and retrieved, and how to read and write data securely (e.g. how to prevent SQL injection; see the second sketch after this list).
- Understanding of networks and protocols, and the effects of packet loss, delay, jitter, etc.
- Understanding of concepts such as virtualization, containerization, Docker, Kubernetes, etc.
- If you are working with others on software, you need to understand git, versioning, central repositories, build chains, deployment, testing, DevOps, and the security and privacy implications involved.
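To make the jump from the C layer to the higher-layer languages concrete, here is a minimal Python sketch (the variable names are made up for illustration). In C, writing past the end of a fixed-size buffer silently corrupts adjacent memory; in Python, the runtime grows its data structures as needed and turns an out-of-bounds access into a clean exception:

```python
# A Python list grows as needed; there is no fixed-size buffer to overrun.
items = []
for i in range(1000):
    items.append(i)  # the runtime reallocates storage behind the scenes

# An out-of-bounds access raises an exception instead of silently reading
# or corrupting adjacent memory, as the equivalent mistake in C could.
try:
    print(items[5000])
except IndexError as error:
    print(f"Caught out-of-bounds access: {error}")
```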
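And as a sketch of the SQL injection point: the table, the column, and the malicious input below are made up for illustration, but the pattern is the standard one. Splicing untrusted input into the SQL text lets an attacker change the meaning of the query; a parameterized query passes the input as data, never as SQL:

```python
import sqlite3

# In-memory database purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Vulnerable: string formatting splices untrusted input into the SQL text,
# so the attacker's quote characters rewrite the query's logic.
rows = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()
print(rows)  # the injected condition matches every row

# Safe: a parameterized query passes the input as data, not as SQL.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the malicious string matches no user
```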
One could probably add many more things to this list, but I think the point is clear: one needs to know much more than just the highest abstraction layer to know what one is doing, to be able to find and fix issues, and to be effective in one’s everyday work. And from my point of view it is the same with AI-assisted coding. Just knowing how to put natural-language requests into a prompt might get you to something that works rather quickly, but without an understanding of what happens below that layer you have no chance of staying in control, shaping a product, guiding its evolution, and fixing bottlenecks.
Every abstraction layer that was put on top in the last decades has allowed complexity and functionality to increase significantly. If we were still at the assembly language level of abstraction today, there would be no way we could have the computing hardware and software we use now; there would simply not be enough manpower to produce, fix, and maintain the code. I see the same thing happening with AI-assisted coding. Some companies might try to reduce the number of people working in IT, but they will fall behind those that keep their manpower and use the tools to increase functionality faster.
Fundamental Change?
Do I think AI-assisted programming will fundamentally change the way we produce software? Yes, by all means, like so many other technologies have done before. Assembly language was a great abstraction over feeding 0s and 1s into a computer with physical switches. The C programming language offered a great abstraction over the things one could do with assembly language. Operating systems offer a great layer of abstraction for programs so they don’t have to interact with the hardware themselves. That is a huge time saver. Higher-layer languages like Python and JavaScript brought an order-of-magnitude speed improvement over languages closer to the hardware. Also, introducing more abstraction makes learning easier: I would argue that learning Python is a lot easier than learning assembly language. So yes, AI-assisted programming does the same again on the next level, and while the change will be dramatic, we have seen similar changes in the past.
What Does it Mean For Me?
Personally, I can use the extra speed to do things I wouldn’t have done before, simply because I did not have enough time. On top of that, I can spend more time writing better and more secure code and producing better documentation for the product, so future maintenance and use become much simpler.
What Does it Mean For Software Development Teams?
And a final thought and link for today on what AI-assisted coding means for the development process when teams of people are involved: it will speed up code creation, but none of the other tasks involved in the process. In many projects, and here I agree with the article I linked to above, creating the code was never the bottleneck in the first place.