The arrogance of the monoculture: on teaching kids


Today we live in the world of the digital boom. The nerds who had a passion for playing with those beige boxes are now leaders in positions of power: they manage companies, they earn huge salaries, they live a good life.

And they turned arrogant.

I’m not saying this with any angst; if anything, it’s sadness. I’m saying it with the perspective of something normal that nevertheless requires awareness to be understood and balanced. It’s natural for a person who achieved success with certain skills to think that the same could happen to others: everyone could do it, it’s so easy. But that is of course a bias. It’s the usual problem: when you have a hammer in your hand, everything looks like a nail. And if that hammer worked marvels for you, the people around you, your industry and the world economy as a whole, you feel strongly about its power and its effectiveness. It’s a completely natural bias, but we should try to look through it because, still, it’s a bias.

Take the article Kids Can’t Use Computers… And This Is Why It Should Worry You. I don’t want to pick on Marc Scott: his argument is actually one of the most moderate you can find around, I enjoyed it, and it’s well grounded too. I also agree with his central claim that “kids can’t use computers”. I just disagree on the context and the conclusion. Since it’s one of the best articles out there, I’ll take it as my starting point among many.

Limiting myself to that single article, here follows a list of all the things you should know to fit the definition of “can use a computer”:

  1. Understand wifi and know how to connect to a protected wifi network
  2. Understand proxy servers and how to use them
  3. Understand ports and addresses (since using a proxy requires that)
  4. Understand what it means to have a filtering proxy
  5. Understand all the ways you can embed something in PowerPoint and choose the right one for your needs
  6. Understand the concept of download vs streaming
  7. Understand what an antivirus is
  8. Understand what a virus is
  9. Understand what “CPU usage” is
  10. Understand what “memory usage” is
  11. Understand the difference between disk and RAM
  12. Understand how software installation works
  13. Understand what an operating system is and how to set it up
  14. Understand that a “computer” might be made of several separate pieces rather than a single one
  15. Understand that you have multiple ways to switch on wifi (wtf, manufacturers)
  16. Understand that the connection could come in different ways (not just wifi)
  17. Understand what a backup is
  18. Understand what synchronization is
  19. Understand the concepts of desktop and icons
  20. Understand advertising and misleading advertising
  21. Understand software versioning and know the differences between versions
  22. Understand the counter-intuitive notion that repeating an action with more energy doesn’t change a thing on a hung system
  23. Understand how to set up user profiles
  24. Understand how a virus infection works and how to clean it
  25. Understand how privacy works in the digital world
  26. Understand how security works in the digital world
  27. Understand the difference between the Internet, the WWW, browsers and search engines

WHOA! These are all the things I extracted from a single article. With just the list above, I could outline a six-month course. Plus, many of the things listed will change in six months’ time. Whoops.

I think we have a problem with the expectations of “using a computer”.

As Scott writes: “The parents seem to have some vague concept that spending hours each evening on Facebook and YouTube will impart, by some sort of cybernetic osmosis, a knowledge of PHP, HTML, JavaScript and Haskell.”

At the same time, people who are familiar with how technology works assume that “use a computer” means “knowing PHP, HTML, JavaScript and Haskell”.

Scott again: “They can use some software, particularly web-apps. They know how to use Facebook and Twitter. They can use YouTube and Pinterest. They even know how to use Word and PowerPoint and Excel. Ask them to reinstall an operating system and they’re lost. Ask them to upgrade their hard-drive or their RAM and they break out in a cold sweat. Ask them what https means and why it is important and they’ll look at you as if you’re speaking Klingon.”

And that’s precisely the point. The problem is the difference in perception of what “use a computer” means.

That’s the same emotion Apple rode in the ’90s. The message was: we won’t make you learn the computer, we will enable you to publish and forget about the computer.

Why should I care about any of this if all I want is to have a video chat with my mom?

The implied reasoning goes: since decades of technocrats weren’t able to make things completely usable for human beings outside that elite, then human beings must be the ones to adapt, because the elite can’t be wrong: just look at all the success we have!

Excuse me, but: WTF.

I thought we were past this narrow mindset years ago.
Well, luckily we partly are: years ago it was way worse. The mindset just moved a bit down the hill, coated in “make your kid’s life better” sugar.

The obvious parallel drawn in these cases is with the car industry: you can drive a car without knowing how it works. That’s a good example, and it already tells us that no, we shouldn’t be required to know all these details in order to use a computer. I’m sure people in the ’50s thought that everyone should learn mechanics, because mechanics was the future, available to everyone, a great industry. Exactly as people today say about code.

But there is another perspective we can take: time. Most of the people with a driving license are able to “use a car”. I’m sure almost none of them would have been able to do the same with a Ford Model T in 1908. Yes, maybe they would manage something, but at the first obstacle – the equivalent of “the wifi doesn’t connect” – they would be lost.

Take a very simple fact: even just 40 years ago, “use a computer” pretty much meant knowing assembly. In 1992, when I had my first Compaq Prolinea 4/33, it meant knowing how to launch apps from MS-DOS and organize files into folders. I learned assembly for fun, certainly not because it was needed. Today “use a computer” means being able to connect to a wifi network and embed a video in PowerPoint. Just 40 years, and we aren’t even talking about the same thing.

As an Italian, I can say the same thing about food and cooking versus how the average American or Briton thinks about them. Being able to feed yourself is, to me, very important; I could argue it’s probably one of the most useful forms of knowledge you can have. Balancing a good diet with the right ingredients is critical knowledge, and the average Italian has it. The average American can cook to the extent that there’s a delivery service available. And be aware: I’m not a chef, and I’m below average in terms of food and cooking, but I know the basics.

The argument is exactly the same: it’s “easy” to connect to wifi, and it’s “easy” to prepare an omelette with cheese. If anything, I suspect there are more people able to connect to a wifi network than people able to cook an omelette.

So are you against teaching how to code?

No. Make no mistake: I’m not denying that kids can’t use computers, or that knowing how to code is useful. I’m saying it’s an arrogant perspective that tries to force everyone to “reach that basic level of knowledge”, where “basic” apparently sits somewhere between “connect to wifi” and “learn Haskell”.

Scott, again, on driving: “I should think the same thing will one day be said about the ability to drive. There will still be the auto-mobile geeks out there that’ll build kit cars and spend days down the track honing their driving skills, while the rest of us sit back and relax as Google ferries us to and from work in closeted little bubbles.”

Yes, we all should.

We should keep suggesting that people try writing code. We should keep introducing more and more people to computers and programming, just as we introduce everyone to maths or history. We might even put learning a second or third language and learning to code into the same updated curricula.

In the end, the whole concept of structured education for everyone – another thing we take for granted but which is very new when put in perspective – should keep evolving, and I see no reason why it shouldn’t include programming.

Let’s keep the conversation balanced, let’s keep improving our school systems.

We should allow people competent at one level to create a transparent foundation for the level above, to help humanity reach new heights.

But please, lose that arrogance.
We need a diverse world, not a monoculture.

·

As a final note: this isn’t in any way different from the “designers have to code” idiocy that cyclically resurfaces and that I already dismissed in a previous article. It’s always the combined arrogance and bias of a monoculture that, for some reason, prefers everyone to think the same instead of fostering different perspectives and growing different talents.
Let’s snap out of it.

Thanks to Dario Violi, Camillo Belotti and Roberto Giardina for the discussion that supported my thinking for this article.