
There's something to be said about growing up using devices that are made for creation and not just consumption - the barrier to going from "this is a cool web page!" to "How can I make one?" is a hell of a lot higher on a phone than on a computer, and higher now than in the mid-90s (if only because the standards are so much higher).

But this guy is just being a jerk: "Digital natives my ass, all they do is stream videos on YouTube and Twitch."

Which is a shame, because it crowds out what I was hoping would be a more thoughtful discussion about devices created purely for consumption, or perhaps walled gardens, and how they don't afford people the same opportunity to learn as general purpose computers.

I wonder if early car enthusiasts felt the same about people who learned to drive when you no longer needed to know how to fix your car all the time.



I think you're applying your incomplete idea of what it means to create. Now phones/tablets let people say 'that video is cool, let's make one.' Same with pictures or other digital art. When I look at the creator options available in my pocket, I'm amazed.

If anything, back in the 90s having to learn to create web pages was a barrier to creation. Video? Forget about it. Digital cameras were just coming around, but were still super expensive. I have some old pics taken on my flip phone from the early 2000s and they are...really bad. Instead I had to spend hundreds of dollars on a completely separate device to be creative. And that device still didn't do video.


> Now phones/tablets let people say 'that video is cool, let's make one.'

The problem is that outside of certain subjects, video is a time-inefficient and low-information-density medium. So many things could be conveyed in fuller form, and in a way more respectful of people’s time, by using text. Unfortunately, the use of phones as default devices by many people today young and old discourages longform text, due to the limitations of tiny touch keyboards and screens. That is my own feeling of unease with the way tech is today.

In some niche travel scenes, I have seen a big decline in meaty, useful information since the early millennium, when a blogger generally had a laptop to type on. Sure, now we get glamorous video content, but that can’t be all there is.


As a counterpoint, I love video content. When I’m learning something, I’d much rather watch a talk on the topic rather than read an article on it, if I can. Especially when being first introduced to a topic, I will always reach for videos when they’re available. Conference talks by the authors involved are the best, if they’re good speakers.

I’m not sure why I prefer video content. I find my attention relaxes more when watching a talk than it does while reading. Like, I find it more effortful to concentrate on a long article in comparison to watching a talk. (Though usually 1.5x+ speed is a necessity). Also, good talks are generally better written than the average article, and good talks are easier to find than good articles are. Project readmes and webpages will say useless things like “This is our database in Go, here’s how you install it. It’s fast and reliable!”. But a good talk by the author will give me the background and the story. “So, here’s why Postgres wasn’t working for us for this problem we have at Netflix. We really wanted to solve our problem this way. So then we built this new thing to do that and it’s working great! Let me show you the weird parts we’re really proud of…”.

Text is great for reference. But I could listen to technology stories like that - telling the “why” of what people have made all day long.


I am the total opposite. The best way for me to learn is picking out the useful pieces of text I need right now and using them. When I watch a video, my brain doesn't actually focus on every detail; it drifts off.


And let me add: I have been reading all my life and while I can skim through even a long text to decide if it's worthy of a longer read, this process is way harder with video.


Even if reading about new topics is harder than watching videos about new topics, it says nothing about the relative efficiency of learning from text or video.


The problem is that the ability to create that functionality - a robust video editor, let alone the sophisticated image processing - is poorly suited to the device itself. Art is intrinsically valuable, and I love shooting on my camera and my phone. But it doesn't really translate into a level of literacy with desktop systems to create whatever the next generation of computing might be.


I bet that the number (or even percentage) of people building those tools has increased too.


Creating high quality video or photos still poses a far higher barrier than simple text. The acquisition hardware (camera) is a trivial step, it's everything else that involves a huge amount of work.


It's a trivial step now. At no point did I say it wasn't a lot of work. But instead of having to create a web page to show off my creative works, I can focus on the work I want to do - take/edit/share pictures.

I just get annoyed when people call these devices consumption only. I could say the same thing about computers when many people only play games on them. These devices are what the person makes of them.


I agree with you; as a parent I witness how my child uses his technology. I did think to mention that we might have biases: as technologists, our world view is dominated by technology. My child has an iPad, and I have a few computers and game consoles in the house. By observing me, my child sees how I use technology.


I agree with you personally. For me, it's easy to gather my thoughts into sentences and paragraphs, and I'd rather learn Markdown and Hugo (or LaTeX!) than edit a video.

But that isn't some kind of law of nature - it isn't true for everybody. There's a significant group of people who find it much easier to use video or voice and powerful tools have arisen to help them.

For example, the TikTok video editor gets people started quickly and lets them learn as they go. I'd really encourage you to watch a tutorial to see how easy and surprisingly powerful it is.


I also think it's much faster to get information back out of text than it is out of a video.


But isn't the topic being discussed technology? Being a video creator or painter using Krita is being an artist, not a tech person.

There is no doubt that modern technology has enabled people to realise a lot of things they would never have been able to, such as creating a movie using tools available in your pocket.

However, the original post was about learning technology. I.e. understanding computers. The difference between being a mechanic and someone using a car to drive to the grocery store.

The problem is that the term "digital native" implies that the person understands how computers work. One might argue that the term actually should mean something else, but I'd argue that would make it an even less useful term. Do we call people who have driver's licenses "motorised vehicle natives"?


Matwood - This post speaks to how we often look at the 90s through rose-colored lenses and see the computers as better. As much as I'm a collector of older machines and the tech of the era, I can also admit that things were more simplistic, slow, and came with plenty of barriers to entry. Simple example: the first iPhone did not even have video capability. Right before the 2008 release, nobody was walking around with a video camera in their pocket. You'd have had to carry around a clunky and very expensive camcorder, and nobody was just taking videos, anywhere, on a short whim. Even cameras just a few years before that were terrible. Nostalgize all we will, the tech was terrible. ASCII warez and all that stuff was awesome. BBSes were awesome, but there is just far more for kids today than what we had.


Another way to look at it is that smartphones and social media let powerful political lobby groups and corporations get ideas and advertising even more directly into people's minds, so that they go forth and replicate them. Who actually starts the majority of trends, and which accounts are the most followed? Is it individuals with no other motive than sharing their creativity? Even in music and art, the mainstream is full of more or less subtly manufactured content. This is propaganda, and there's more of it than ever before.

And no, I'm not blaming the kids who grew up with this and can't know any better (yet). It's the adults who are wrong. They're old enough to know better, but they're content with their positions, they're lazy and they're scared of taking risks by speaking up.


Yeah the creative possibilities have definitely increased. It’s a cycle though, the early podcasters and vloggers look at today’s crowd and can’t believe how easy they have it...”we literally had to run a cli tool to download our podcasts!”, “there was no YouTube let alone iPhones!”


Interestingly enough, there still is a high barrier to entry. Nobody wants grainy mobile pictures or mobile clips with poor audio quality. You'd still need a proper camera and a reasonably powerful computer to cut something that meets modern standards.


Oh come on, let's be honest. Open up Instagram and look at what people "create", most of it is memes and image macros. Not even comparable to, as an example, the first years of Deviantart (not what it has become today).


> "this is a cool web page! How can I make one?"

I sympathize with what's been lost, but what happened was people realized they didn't actually want to build a web page, they wanted to converse on a forum and share pics, and be entertained. Sure, you could publish anything you can imagine that's less than 4 MB on Geocities, but I'd rather have Wikipedia.


It's more like "That's a cool YouTube career, how do I make one?"


(Perceived) fame and fortune are pretty standard aspirations.


Fame and fortune aren’t the only reasons people try to make careers out of content; some (maybe even most) are just trying to support themselves by doing things they enjoy instead of sitting in a cubicle from 9-5 so they can afford to do those things on the weekend.

I find it pretty inspiring that we live in a time where someone who really loves doing, like, aerial yoga can reasonably make doing aerial yoga their job just by learning how to edit videos and engaging with other people who are also into that thing.


It's interesting that streamers and "influencers" seem to be replacing, or at least competing with, other pop culture and performing arts celebrities.

Though there doesn't seem to be a huge difference in practice between "YouTuber" and "television presenter" other than the broadcast medium. One thing that YouTube adds to television is an easy way to interact directly with viewers.


Yeah, I grew up in the same era as the article author, and I'm honestly much happier about the state of access to resources these days.

The modern "digital native" that I categorize is the legions of young people getting started in programming the vast number of rich environments that exist today.

Yes, I remember the times: upgrading my 486DX-33 to 8 MB of RAM, installing Slackware from floppies, figuring out whatever dark magic was needed to run Duke Nukem 3D over IPX networks simulated over dialup, heady and critically important discussions about the role of Amigas, coveting the barmaid in LoRD on the high school BBS, running unauthorized Star Wars MUDs on the high school computer network, and getting dragged in front of school boards because you ran "nethack" once (in fairness it was likely also the pornography being accessed and distributed over the school T1 lines, but in my defence I was 14).

It was a time and a place, and there's certainly value in reminiscing. But these new kids will create their own time and place, and demanding that they somehow pay homage to my experience as being "more authentic" seems a bit pleading.

And to be honest, we downplay the limitations of our times. Free and accessible development environments? Not on the standard desktop PC operating system (until DJGPP came along, anyway, and even that had severe limitations for a while for building native Windows apps). If you were a super nerd who went out of your way and downloaded an obscure little free Unix system developed by some Finnish guy, named after a Peanuts character, you got a compiler. And then, if you scrounged around random doc files spread across dozens and dozens of random HOWTOs, you could sort of learn how to program C, or Python, or Perl.

I look at today's technical environment available to the entry-level student in some technical field - the breadth and depth of tooling, the amount of documentation and tutorials, all available for free - and it's amazing. If you're someone looking to build something interesting, there are a hundred more opportunities to do that today using accessible tooling than there ever were when I was a kid.


I don't think the author is disputing that; of course tech today is better than it was before. What he is saying is that, having learned to use computers as the computers themselves developed, he got a different, better insight into how they work and what potential they have, whereas today's youth use computers as a black box, not caring how they work inside, and thus not learning how to use them to their full potential.


I grew up at the tail end of the era in the article. Windows 95 was well established, and while CAT construction games still came with controllers to strap over the keyboard or I had to learn how to troubleshoot the game port to get SimCopter to work, I never had to learn IRQs or use dial-up.

And I admit having a sense that I am not as knowledgeable as I could be about using these insanely powerful devices to their fullest potential. The demoscene can make amazing things happen with teensy amounts of code. Amazing!

So I started watching Ben Eater's build-your-own-breadboard-computer series, and am now following his 6502 series so I can eventually convert my Model M into a self-contained, battery-powered DIY AlphaSmart word-processor-to-SD-card contraption, because why not?

So I too agree that the tiktokers are the kids these days for now, but I more heartily agree that some of them will discover something new and amazing on that there YouTube. Maybe they'll learn something we've forgotten and leave all of us in their dust with what they do with it!


You must have missed QBasic, freely available and included with MS-DOS 5 and later. Pretty powerful for how easy it was to use. Delorie's tool was nice, but nothing compared to Borland's or Watcom's.

Then again, a bit earlier you had BASIC included on every microcomputer, and hardware documented way better than random PCs glued together with duct tape.


Hi, author here.

First: I must say I was surprised that my little rant made it to HN, but obviously some found it interesting, and it did spark a discussion.

Secondly: Yes, I do come off as a jerk. English is not my native language, and my Norwegian dry humour may not have translated as well as I had hoped (which is not an excuse). Also, at the age of 38, I do not really have that much experience with what today's youth do, with the exception of kids between 5 and 10. And that is not a fair comparison anyway.

I have not yet read all the comments here, because I fear some of them might be a bit personal, while my "attack" was mostly a generational one. As I said earlier, this is the first time I've gotten a real audience on my blog, and I was not prepared at all for this kind of reaction.

But I value your feedback, and I see your point. Next time I will try to look a bit past the "rough" language and see topics from different angles. Because the world I grew up in is gone, and we are still making progress, so obviously things are going the right way.


Probably the language. You guys aren't the digital natives; you are more like the digital pioneers. You weren't born into digital land like today's digital natives: you came over on an analog boat, so to speak, and with your axe you built a digital country.

It's the kids who grew up in digital land whom we should consider the digital natives.

2 cents from the peanut gallery!


I think the line about streaming is less of the author being a jerk and more of the author resetting the dialogue concerning the average tech know-how of Gen Z relative to previous generations. Of course for every IT geek in the author’s generation, I’m sure there are at least two in Gen Z, which the author could have acknowledged.


Author here.

You are right. My intention was not to come off as a jerk, nor did I expect anyone to read my blog post, to be honest.

My point, which kind of got lost in the end, was to show the big difference between my generation and the next.

I have heard so many people praising today's kids as technical wizards because they master the gesture/finger-based interface so much better than the generation before them. The term "digital natives" I got wrong, big time.

In hindsight I could have spent some more time polishing my idea before publishing.


I think this is because you are thinking as a programmer/something.

Folks now have web page editors in their phones/tablets. Drawing apps, etc.

When I was 16, yeah, I want to make a webpage about WWE/F, so I had to spend days learning table based layouts and how to ftp files to a server (and find one).

Now my son can open his phone, where he has various tools to do so. He can share/post pictures of someone doing a full nelson and write about it, and get it up on the web in minutes vs. days.

Just because he didn't have to learn 45 different incantations at the command line, doesn't mean he isn't creative, he just doesn't need to slog through programming to do it. (He is learning python though)

Think of it as frameworks: when we started, we needed to learn it all to make any kind of decent page; now most folks here would use Bootstrap/Material UI/whatever, or React/Rails, vs. writing all the CSS by hand, fighting with all the problems it brings, configuring Perl scripts to be able to send an email from it, etc. etc.


True - I am thinking of it as a programmer. And I wasn't entirely clear. You can create lots of wonderful things on tablets, phones, locked-down computers, etc. but it relies on someone else making the tools. On my computer I can use a program, and then make a program just like it. On a tablet I can use an app, but if I want to create an app I need to use a computer. It's one layer of abstraction removed.

The comments here have been a helpful reminder that creativity is very much alive and well! I do worry about how many people would think "This app is cool! I want to learn how to make them" but stop because the device for consumption is different from the device for creation.

Really dating myself here but the very first programs I used were ones from a magazine I typed in to a little TRS-80 pocket computer from Radio Shack myself (I'm not 100, really, I just started young and had an encouraging grandpa), and in the process realized "hmmm I could change this line and get a different output, cool!". I think that's about the lowest possible barrier between consumption and creation.

On the other hand, I learned very, very slowly compared to now, in large part because it meant getting a magazine or reading a book, and having nobody to ask when things weren't clear. On the whole it probably is easier, now that I think about it, but only if you're motivated. Maybe that's OK - people who want to can still learn to rebuild a transmission, even if most drivers don't.


A few things that would help this issue:

1) A mobile-first IDE - app and responsive website (that you can share links to) with a great interface for mobile that makes use of strengths like the touch screen, while getting round the weaknesses of screen size and lack of a physical keyboard by not relying on typing out full words and not using the convention of having entire files open in a tab (e.g. a single function or code block on screen at a time). Preferably with accounts you can use to log in from whichever device suits at any given point.

2) Games and other apps that are made to be "modded" and that link to that IDE. Modding - the art of making add-ons to games - is a big way that PC gamers get into coding these days; if this were a common thing with mobile games and apps, you wouldn't just have the PC gamer crowd modding.

3) Programming and development communities that are easier to find from a more diverse set of social networks, so that more people who enjoy expressing creativity with others would find out that they can create this way as well.


Not sure what a TRS-80 is as I don't think we had them here, but I had an old Spectrum and did the same :)

But there are tools on mobile/web to create those things too. I've been researching and trying to build something that is intuitive to work with on tablets for programming web apps (it started web-based, but I'm now moving it to mobile as I think it matches the visual paradigm better).

There are various tools for kids to learn to program creating games (Scratch is probably the most well known) but other visual programming apps also.


Agree. Plus, anyone who has gotten into streaming knows that there is a learning curve which introduces the streamer to all sorts of hardware and software. The best streamers often invent specialized solutions for their needs.


I am hopeful we will see a creative renaissance in technology in our children. You can do more than consume with your phone today. Our children have iPads - hybrid computers - at a young age and are learning to access information more readily. They are general-knowledge whizzes and are learning to do more than just consume. The software is better, and with cloud services the average child will probably never see a data center or build a PC; however, they most definitely will use Airtable and probably build an app in the cloud. The world for them will have opportunities at higher levels of abstraction. At least I hope that is the case.


>I wonder if early car enthusiasts felt the same about people who learned to drive when you no longer needed to know how to fix your car all the time.

I don't think you need to go back to early car enthusiasts. A lot of people would argue that at least being able to do basic car maintenance is a life skill everyone should have. I don't necessarily agree but I do think it's useful to understand at least the basics of how a car operates.


The car is a black box. Gas goes in, vroom comes out. If the box stops working, I pay someone to fix it.

You might argue I'm not car literate, but I'm not sure that is particularly insightful, as the type of specialized knowledge is not actually relevant to doing almost anything with a car that people want to do.

Similarly, while coding is great and all, I'm not sure that not being able to code is the same thing as not knowing how to use a computer effectively.


> The car is a black box. Gas goes in, vroom comes out. If the box stops working, I pay someone to fix it.

It would be a huge problem if you were an aspiring mechanic or anything related, but that's an increasingly niche role, so we aren't worried about kids not having mechanical literacy. Many kids are aspiring to work in roles where computer literacy is vital and becoming more so every day. The ones who lack the fluency to do things like automating the repetitive parts of their jobs will gradually not be adding enough value to be gainfully employed.

Not being computer literate will have a huge impact on their careers, not being car literate won't (for most).


I won't argue. And I grew up at a time when cars were less reliable (and my first clunker certainly wasn't). I do think some basic things like changing a tire, jumping a battery, and at least knowing how to check fluid levels is useful. You don't always have something go wrong where you can easily just call for help.


Yeah, except in new cars you don't have access to the battery (they are starting to make the engine 'protected'), and even changing a tire is rarer, as many manufacturers are shipping a repair kit vs. a spare wheel. Most also have roadside assistance (or your insurance does), which makes it easier.

Most people want a car to go from A to B. If every few years (I've only had one blown tire in 20 years of driving) they have to call someone to help them, that makes much more sense than spending the time to learn how to do it themselves. Same with computers: they don't want to know how the bytes get into RAM, they want to play Doom or write a school paper. There is no need for them to learn all that.


In such cases though I always have a nagging suspicion they are fixing things I don't need or gold plating the service. It definitely happens to some people.

So, I always feel better with a small amount of understanding to arm the BS detector.


Totally agreed. I'm going to be a lot less charitable than you, because to me this is some pretty obvious gatekeeping and I don't have patience for it.

This attitude was around for a while when Linux started getting easier to use (I feel like it's gotten a bit better, but maybe I'm just a part of better communities now). It's the same attitude that came up occasionally in game dev around Unity. It's pretty predictable and pretty tiresome. I don't think the author is a bad person or that they hate kids, but I think this kind of attitude is something that shows up regularly in technical communities and it's worth forcefully stamping out.

This article as it's written isn't interested in education, it's purely inwardly focused on describing how hard the author had it growing up, and how rewarding it was, and how great they turned out, and how everyone else who didn't have that same experience is a poser. It's purely designed to put younger generations down and denigrate them rather than reach out to them in any kind of thoughtful or meaningful way.

The author isn't proposing any solutions. Forget solutions, they're not even identifying problems. There's nothing of substance in this post other than bragging. No mention of how proprietary hardware incentivizes lock-in. No mention of how laws have changed. No mention of how software gets written today and how our toolkits affect accessibility. No mention of the rise of SaaS and how that affects people's ability to modify the programs they run. No mention of education challenges. Just nothing at all.

The only reason this blog post exists is because the author is mad that some kids are getting more attention than they did. And while it's worth talking about increasing barriers to creation, the author doesn't seem to be equipped to do so, and their targets of ire (streamers and content producers, arguably some of the more technically involved youth communities out there today) are poorly chosen.

> "If anybody is a digital native, it is me. I did not just grow up with computers, I grew up alongside them."

We get it, you're very smart. I'm super proud of you for installing Windows from a floppy disk. Do you want a medal? Should we all clap for you? Round up and scoff at the people who didn't appreciate your generation enough?

Notice what this article never says. It never says that a reduced hacker ethos in younger generations is a problem. Its primary concern is not that younger kids aren't engaged enough with technology, or that they're not hacking their devices. The primary complaint this article raises is that younger kids are called digital natives, a title that the article is concerned they don't deserve.


You wrote a critique of this blog post that's actually longer than the blog post itself, but you failed to notice the obvious joking tone the author used, so the whole thing sailed right over your head. He's clearly just doing a humorous "get off my lawn" bit about the term "Digital Native". Relax.


There's nothing wrong with consumption devices, but they are not computers. You are using a computer if and only if you are programming, otherwise you are using a glorified television or a glorified typewriter. There's nothing wrong with that. The vast majority of people want exactly that. We shouldn't fight it, nor we should believe we are better or smarter.

But that's not tech literacy.


> You are using a computer if and only if you are programming

Maybe my definition of programming is quite narrow, but I wouldn't consider using CAD software, spreadsheets, or (physics, space, etc) simulation software as programming, yet I would consider the ability to use them a degree of computing or tech literacy. I would certainly consider them very far from only being a consumption device.


The thing is that computers are useful for so many things that I find even just attempting to define what is "tech literacy" difficult and bound to endless discussion, and well the result is this blog post and the following debate here.

It's as if you were to claim "desktop literacy" for the noble tasks that can be performed on a (e.g. wooden) desktop - traditionally, let's say, copying evangelical scriptures - and were angry at those ignorant young folks who don't give a fuck about the Bible but find writing novels cool; and let's not even talk about the peasants who merely use their "desktops" to cook and eat (while in the background a pen maker listens to the copyist's rant with a small smirk).

At this point the vague "tech literacy" term is not useful anymore, and the problem is just that more precise terminology is needed to communicate efficiently.


All of those are heavily algorithmic in nature, so understanding algorithmic complexity makes it easier to understand how and why the software handles the way it does.

For example, joining together many separate 3D objects is extremely slow if one just selects all and does an union, since the algorithm needs to check all pairs of objects for overlaps in 3D space.

That may not be immediately apparent to non-programmers, but to programmers, it will feel like an intuitive consequence of the problem and the inevitabilities of its solution.
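The quadratic blow-up described above is easy to see in a toy sketch (plain Python, not any real CAD kernel's code, and the box layout is invented for illustration): a naive union must consider every pair of objects, even when almost none of them actually overlap.

```python
# Toy sketch: a naive boolean union over N meshes tests every pair of
# axis-aligned bounding boxes for overlap - N*(N-1)/2 comparisons,
# i.e. O(N^2), regardless of how few pairs actually touch.
from itertools import combinations

def boxes_overlap(a, b):
    """Boxes are ((xmin, ymin, zmin), (xmax, ymax, zmax)) tuples."""
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

def candidate_pairs(boxes):
    # Checks every pair - for 100 boxes that's 100*99/2 = 4950 tests.
    return [(i, j)
            for (i, a), (j, b) in combinations(enumerate(boxes), 2)
            if boxes_overlap(a, b)]

# A row of 100 boxes, each overlapping only its immediate neighbour.
boxes = [((i, 0, 0), (i + 1.5, 1, 1)) for i in range(100)]
pairs = candidate_pairs(boxes)
# Only 99 pairs overlap, yet all 4950 pairs were examined. Spatial
# indexing (a BVH or a uniform grid) is how real geometry kernels
# avoid this quadratic scan.
```

That gap between work done (4950 tests) and useful results (99 overlaps) is exactly the kind of behaviour that feels intuitive to a programmer and mysterious to everyone else.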


How many people are at the intersection of (a) being able to competently use CAD software or specialized software for physics or engineering and (b) being unable to write a simple script?

Literacy isn't about reading all the time, it's about being able to read if you want/need to. Same with computer literacy.


Essentially all the older non-software (i.e. electronic, mechanical, aerospace) engineers I know.


> There's nothing wrong with consumption devices, but they are not computers. You are using a computer if and only if you are programming, otherwise you are using a glorified television or a glorified typewriter.

*Looks at all the creation-related tools on his iPad and iPhone, none of which involve programming*

You sure about that?


That's.... a bizarre frame I don't think I've ever heard before.

Does this mean that my laptop undergoes some sort of miraculous transformation when I type :wq and open my browser?

What if I'm watching video in the background while coding - does my laptop then enter some superposition-state?


I don't see how this is strange, I think I failed to explain myself.

The defining feature of books is that you can read them, and books are useful because they have that property. You are literate if and only if you can read a book, regardless of whether you are reading a book at the moment.

The defining feature of computers is that they are arbitrarily programmable machines, and computers are useful because they have that property. You are computer-literate if and only if you can program a computer, regardless of whether you are programming a computer at the moment.


Wait, hold on though. You just jumped from reading to writing.

You are literate if you can read an arbitrarily chosen book. You are an author if you can write a book. Similarly, you are computer-literate if you can accomplish tasks in an arbitrarily chosen program and maintain a system. You are a computer-author if you can write a program.

We use literacy to describe understanding, not creation. What you're claiming is the equivalent of saying that modern generations can't be book-literate unless they're writing fan-fiction.

I would argue tech-literacy means understanding the common language of tech/UX today and being able to use common technology. Someone is tech-literate if they can "read" technology and are comfortable with common interface conventions, terminology, system maintenance, etc.

Anyone who sets up a streaming platform using OBS, who manages a community using admin tools on tech platforms, who figures out how to tune games that they're playing to accommodate recording without dropping frames or de-syncing audio, who sets up microphones and figures out how to balance audio using mixers, who edits the result and uploads it to Youtube -- they would fall very squarely into that category of literacy. They clearly know how to "read" software, even complicated programs like video editing tools.


A book is more or less only useful for reading. Well, I suppose I can stack some books if I want to elevate something on my desk, but that's pretty much it.

Whereas a computer can do many different things. Yes, that's because it's an arbitrarily programmable machine, but if I choose to use software written by others rather than programming it myself, I'm not sure why that's a lower use.

In fact, I can program but rarely do so. Usually I'm using a computer to do tasks like writing, working on photos, etc. My day to day use is sort of irrelevant to the fact that I can do some programming. So I guess I'm not computer-literate.


So if someone who can't read picks up a book, it's actually not a book...?


But isn't it though? Does someone literate in film need to make movies? Why can't someone be literate in tech without writing html?


It is the simulacrum of tech literacy.

Just like an Office 365 subscription is a simulacrum of a horde of calculators (people) with mass-produced calculators and typists on mass-produced typewriters, which were simulacra of noblemen scholars, pens (mass-produced simulacra of quills) and papers (mass-produced simulacra of expensive parchment)… which themselves were simulacra of prehistoric humans making cave paintings.

So do you really feel the need to tabulate, calculate and write today?


What an odd form of gatekeeping. To what end?


I am specifically not attempting to 'gate-keep'. You don't want to code? Don't. I'm not going to force it down your throat.


You're gatekeeping who qualifies as "using a computer", and using a really odd definition to do it.

Now, the rest of your point is, who cares? Do with that thing whatever floats your boat, it's fine if you don't program. That point is valid. But when "that thing" is a computer, and they're "doing" something with it, that qualifies as "using a computer". At least by the definitions the rest of us are using.


That's just a debate on the semantics of the sentence "using a computer". Maybe your definition is better, I don't particularly care.

The point I'm trying to make is that I perceive a very clear conceptual difference between "using a computer to program" and "using a computer to do other things", and I would tend to believe an effort to improve computer literacy should attempt to point people to the former rather than the latter.

Do you believe those things to be the same? I have a strong intuition that they aren't, but I'm very willing to hear a counterpoint.

Once again, I'm specifically trying to avoid any kind of value judgment.


My counterpoint to

>I would tend to believe an effort to improve computer literacy should attempt to point people to the former rather than the latter.

is why? I mean sure. If they want to become programmers, then they need to learn how to use a computer to program but I'm not sure why that's any more about "computer literacy" than lots of other tasks.

I use computers for lots of things on a day to day basis, including many "creative" tasks, and almost none of those involve programming. Programming is a specific way that you can use a computer and it may imply deeper knowledge of the underlying system than making a video, but so what?

For that matter, I could equally argue that a pure front-end developer isn't really computer literate because they maybe don't understand how kernel schedulers, security models, processes, interrupts, etc. work. Oh, and how about TLBs, cache eviction policies, dynamic resource allocation, etc. at the CPU level?


Because that's the fundamental, distinctive, unique thing about computers. That's the reason we bother with computers in the first place.

There's certainly nothing wrong with using a computer for other stuff, also. Or even exclusively; not everyone needs to be a programmer, that's for sure. I would argue, however, that algorithmic thinking/basic scripting is a very important piece of human knowledge. Is it really more esoteric than Latin or Greek, which are routinely taught to high-school students?

> For that matter I could equally argue that a pure front-end developer isn't really computer literate...

There are different levels of literacy. A 7-year-old and a PhD in English literature are both literate, but not at the same level. Same goes for computer-related knowledge: it's a bottomless pit, like any other interesting field.



