The dream of modern computing was that a user was someone who could use and modify and create programs. Somewhere we departed from that and made a "programmer/user" dichotomy. And as both computers and programming tools have become more accessible (free, easier to use, etc.), somehow that divide keeps getting worse
Computers are seen as tools only for programmers and as appliances for everyone else. IMO this is bad for society. Being able to program your computer at a basic level should be seen in the same light as basic literacy and numeracy.
@calcifer I was there when it happened, and it wasn’t an accident.
Not enough time to elaborate right now, but it’s a tragedy for sure.
@requiem @calcifer I know there are assholes and capitalists who want to (in this case, actively) alienate people so they have more control. And not having a basic programming language clearly visible in the initial menus when you get a pre-installed computer is really bad (ideally with basic drawing/user interaction included).
But... people just wanting to use the computer to do things is sort of normal? Though I kinda think the two worlds should be as close together as possible.
This goes for a lot of things, but often the naive-user software is not made for the people who want to tinker with it, or roadblocks are set up...
It's made very monolithic. Partially because inter-process communication isn't used much and isn't very clear to people, and e.g. piping between programs is not exactly up to the task.
@jasper @requiem this is very much the point. Our assumption that "users" will not want to program means designs hide programming from users. This, among other things, leads users to assume that programming is necessarily hard and complex, which makes them turn away from it
Yet when I show most users that they can use a macro or a script to automate manual steps, very few are anything other than excited to do that. That's programming!
I think macros/scripts often require something to be subdivided into somewhat neat blocks.
For instance, bookmarking could be a more separate component. Other document readers, or the filesystem, could then use it better. Or some tags could cause scripts to run, for instance. (Bookmarking could also allow adding a bunch of quotes from the URL, imho.)
There will always be a level of programming that IS complex and that most users will find uninteresting and/or inaccessible. That's ok. The problem is that we increasingly make those barriers deliberate and discourage people from using the simple programming facilities that are available
Systems design should include the assumption that many users would want to program the system at some level, and we should make that easier and more accessible through design and education changes
@behemoth it's a question of literacy and available education. Learning to use a computer including basic programming of it isn't fundamentally harder than learning to read at an acceptable level. We just do a much much worse job at teaching it, and in the last decade the design has been increasingly hostile to it.
@behemoth being able to write a program that accepts input, manipulates it, and saves its output; and being able to use this to transform data in order to pipe it effectively between applications. Being able to automate a series of actions you normally take manually.
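That kind of program can be only a few lines. A minimal sketch in Python; the specific transformation (normalizing names) is invented for illustration, and a real pipeline filter would read `sys.stdin` and write `sys.stdout` rather than using a list:

```python
# Sketch of the "accept input, manipulate it, save the output" pattern.
# The normalization rule here is just an example transformation.

def transform(line: str) -> str:
    # the "manipulate it" step: strip stray whitespace, normalize case
    return line.strip().title()

# simulate a few lines arriving from another program in a pipeline;
# a real filter would iterate over sys.stdin and write to sys.stdout
raw = ["  alice smith\n", "BOB JONES  \n"]
cleaned = [transform(line) for line in raw]
print(cleaned)
```

The point is the shape, not the details: input in, a small transformation, output out, composable with other programs.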
@calcifer @behemoth how would that help change the dichotomy in any appreciable sense? Just because someone wrote a basic hello-world app in school doesn't mean they have any greater appreciation for computers, any more than knowing how to say "Je m'appelle Alex" gave me any greater appreciation for French
The difference is that instead of giving up on problems, they're more confident trying to solve them. Instead of doing a task in hours of cut-and-paste-and-update, they make a macro (yes, that's programming). It's not about "appreciation"; it's about accessing more of the power and utility of computers
@calcifer "when I teach people basic programming skills (which is much more than "hello world")" how long do you spend doing this?
@xmakina typically it's been about a 3-month course of study, with a half day in the classroom once a week and about 3-4 hours of self-study each week. So about 96 hours of student effort with 48 being classroom
It would be easier to do this by incorporation in other tech literacy programs: what I've done is very focused work that aims to go from "I have not really used a computer" to basic competence, including macros and some simple scripting
@calcifer okay, so a pretty decent course (learning to drive in the uk is about 67 hours according to https://www.directline.com/car-cover/magazine/how-long-does-it-take-to-learn-to-drive)
For me at least, that explains where the dichotomy has come from and also why it's getting worse; the endless rush of capitalism to be constantly consuming/using/working means asking Joe Bloggs to put 100 hours into a training course (plus the cost of hiring an instructor) is just not feasible, and basically impossible without putting legal punishments behind not doing it
@calcifer it also highlights how the original dream was desperately naïve and clearly conceived by the white middle-class men of the time; to just assume that, yes, people can obviously have that kind of time and money just lying around, with nothing better to do with it
@xmakina you are aware that we already teach computer literacy courses to kids in publicly-funded primary and secondary education, yes? Having to take time out or spend money is not on the table here: bettering the education we're already supplying is. We already spend well over 100 hours on computer education in primary schools
I'm over here complaining that we as a society don't systematically include programming as part of computer literacy, and it sounds an awful lot like you're saying "oh, but we should keep gatekeeping programming because it's expensive to learn as an adult"
@calcifer I'm in the UK, so "we already teach computer literacy courses to kids in publicly-funded primary and secondary education" is not applicable over here. Same goes for "we as a society"; the USA is also a very different society to the UK.
I think we're approaching the conversation from such different positions it's probably best to stop here
@xmakina the UK has been including programming in its primary school curriculum since 2014, and that replaced a standard computer literacy program that started in the late 1990s.
I guess what I'm having trouble with is why you're arguing that when we teach people to use computers, we shouldn't include programming them as part of that education. What's the problem you have with that?
@calcifer although, to challenge the gatekeeping statement: I'm not sure where you've pulled "gatekeeping" from, as opposed to "not everyone can drop 100+ hours (and costs) into a skill"; it's just a statement of the material conditions for most people.
It's a question of feasibility in our current economic climate. I would love to have folks in their 30s be able to do a 100+ hour course on something that would impact their lives in a positive way — cooking, sewing, programming — but that's basically unheard of
@xmakina because my whole complaint is we've set society up so that if people want to program, most currently have to wait until adulthood and invest a bunch of time and money. And my assertion is that it doesn't have to be that way, that it is that way because we don't see programming as part of computer literacy, and it would be a social good if we changed that so that people didn't have to learn it as a separate skill
And you're arguing against that. So, that seems an awful lot like you're supporting the status quo—that the gatekeeping is ok and we shouldn't work to change that.
@calcifer I think we're probably talking about different things in different situations, mostly because we're going over a huge topic and two different continents; I would like to ask that you don't take the accusative approach of telling me what I'm doing ("gatekeeping", "arguing against that") and instead try to correct where I'm discussing something different to yourself
@calcifer in that vein; let's try to narrow down the topic:
Are we talking about educating kids so that they grow up to be "the dream of modern computing", or are we talking about breaking away from "a "programmer/user" dichotomy" with people who are already adults?
@xmakina I'm talking about a cultural change wherein technologists stop treating "programming" as a thing that's only done by experts, but rather as something that should be accessible to all users. I'm talking about what the world might look like if we did that.
If we did that, we should see kids' programs for computer literacy include programming. We should see vocational training for adults including programming topics. We should see system designs encourage discovery and use of programming capabilities. We should see simplified programming interfaces that are faster to learn.
The current situation is not a reflection of the dream of modern computing, but a corruption resulting from propaganda and corporate-driven consumerism designed to gatekeep knowledge and exploit everyone but an artificial elite class
@calcifer right okay, I think I'm with you.
Programming is absolutely a separate skill to computer literacy and, in many circumstances, should only be done by experts (basically anything involving network communication). That's not a corruption or propaganda, that's an acknowledgment that this stuff is hard to get right and very easy to get very wrong (just look at how often system fuck ups still happen).
@calcifer I honestly disagree with this. There’s no more or less virtue in knowing how to program than in knowing how to bake bread, or to mend torn clothes, or to play a musical instrument. And I mean this in quite a precise way: they’re all things it’s good to know, but there are more things it’s good to know than any one person can learn, even without taking aptitude and inclination into account.
@calcifer (And when you say “a dream of modern computing” you mean “a dream of modern computer enthusiasts”. It’s worth reflecting on who that encompasses and who it doesn’t)
@ghost_bird and no; I don't mean "enthusiasts". I mean the creation of the modern computer was driven heavily by people who strongly believed that it should someday be accessible to everyone, as a piece of common infrastructure for the good of society. Corporate interests are what pulled it away from that and redefined "accessible" as "we can use it to exploit more people"
@calcifer The thing is, though, hacker culture was always a culture of young white men who made a lack of domestic skills into a badge of honour, and we’ve inherited their skewed perception of what’s important.
@ghost_bird first, there were people working on modern computing before there was "hacker culture". And no, it was not always the whole "lack of hygiene is a badge of honor" shit. That's late-90s hacker culture, which is maybe hacker culture's midpoint. And that's well after the divide I'm talking about happened
I was there for most of this, and the popular ideas of that history are really inaccurate and shaped by media and propaganda
@ghost_bird it's not a question of virtue, but of utility. Everyone should know how to wash their clothes, cook for themselves, and perform basic maintenance of their living space, because there's high personal and social utility to those things. Not in everyone being an expert, but everyone having the basics down
If you find value from using a computer, then the utility of that computer will be massively enhanced if you know how to program it at a basic level. And like basic cooking, it's not nearly as complex as people would like you to believe — I used to teach retirees this skill, and they had a basic competence after 3 months of weekend classes, and that was starting from "I have never touched a computer"
@ghost_bird @calcifer Wouldn't more basic skills and general knowledge in all of those areas over the last 100 years have mitigated some of the terrible situations we now have with food, agriculture, textiles and entertainment? Part of consumerism is a kind of learned helplessness where we give up learning how to do things and in the process concede decision-making to (often myopic & greedy) product development teams. You don't have to become Julia Child, Norm Abrams or Herbie Hancock to have a little basic literacy in an area of culture that is deeply important to our health and survival (not to mention pleasure).
@ghost_bird @praxeology it's not "or", though — that's just a Utopian Fallacy. People getting more value out of their computers rather than having to rely on corporate spoon feeding is a meaningful difference in power. No, it will not end poverty or fix all ills. It's not a panacea.
But there is no downside, and significant upside, to broader computer literacy
when I was in high school, everyone who wanted to learn got taught at least how to code in BASIC; the classes were also 60-75% girls. This was in England from 1985-1988, and it seems that level of education got lost in the 90s somewhere (they have at least been trying to reintroduce it since about 2012)
@calcifer @ghost_bird I agree, it's certainly not going to solve all problems. But if we want to have less big market dominance, we will need to make some serious changes in our material culture to be less consumerist. Home-made pickles, bookshelves and spreadsheets may be small but important parts of that.
I would add that these forms of knowledge also lead to more informed voters when it comes to relevant topics. Witness recent debates about one or two medical topics – basic education plays a huge role.
@calcifer well I don't disagree necessarily, it feels like it should be in the same category as basic car work, appliance repair, cooking well, or any number of specialized-but-common skills.
Let's not confuse "allow anyone to code" with "everyone should code"
@astraluma I find the idea that "not everyone should code" tends to be born from the very dichotomy I'm talking about. It's rooted in the assumption that programming a computer means doing the kind of programming that professional programmers do
It doesn't have to be. Block programming, macros, basic scripting tools, etc. are things we should consider part of basic computer literacy. Not everyone needs to know C/Go/Python; but everyone would be significantly helped by being able to use blocks and macros and small scripts
It's more akin to being able to cook than being able to repair your toaster: everyone benefits from the basics, not everyone needs to be a chef.
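As a concrete illustration of the "small scripts" level meant here: a few lines can replace an hour of hand-renaming files. Everything about the naming scheme below is invented for the example:

```python
# Hypothetical chore: scanned files named scan001.jpg, scan002.jpg, ...
# should become receipt-001.jpg, receipt-002.jpg, ...

def new_name(old: str, prefix: str = "receipt") -> str:
    # pull the digits out of the base name, keep the extension
    digits = "".join(ch for ch in old.split(".")[0] if ch.isdigit())
    ext = old.rsplit(".", 1)[-1]
    return f"{prefix}-{digits}.{ext}"

# compute the rename plan; a real script would then call os.rename
plan = {f: new_name(f) for f in ["scan001.jpg", "scan002.jpg"]}
print(plan)
```

This is the "basic cooking" tier of programming: no types of expertise beyond knowing the pattern exists.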
@calcifer these are all valid points, and I agree with them.
But in the same way that I have no interest in doing car work (despite having plenty of mechanical acumen), some people just want a Facebook, email, and photos machine.
And that's fine!
I did not say "not everyone should code". I encourage computing accessibility and making it easier to do compute things, but I also recognize that not everyone will want to. These are both valid use cases, and both should be supported and encouraged.
Cars support both A-to-B drivers and people that rebuild their engines every year. Computers should do the same.
@astraluma not everyone wants to read. We recognize the social utility in trying to make sure everyone learns the skill thoroughly (even if we don't succeed in that always)
I never said we needed to force anyone to code. I'm saying we've created a culture that keeps people from using their computers to the full by making them believe that programming is something that is always complex and requires deep, arcane knowledge. That's not a choice; that's bullshit that's keeping people from making full use of the computers they have access to
I think it would help matters immensely if people were taught that simple programming tasks are actually quite approachable, and made the assumption that most people would use those facilities if they knew how, just like we assume most people will use the other basic computing facilities once they know how
@calcifer It's interesting that companies like Microsoft support initiatives that teach people to program... as long as they don't question their supremacy.
E.g. you are not supposed to see the source code of the web-based teaching platform; that's for the parent company's eyes only.
@csepp yep, most programming education is indoctrination into the programming career pipeline, not taught with the idea that "you can make fuller use of your own computer by programming it"
@calcifer I was thinking about how you seldom get "manuals" with anything anymore.
I borrowed an old drill from a friend, it had a disassembly guide. It was sort of inviting you to maintain and repair it and get familiar with it.
My new drill came with a 4-page leaflet that says you can go to url:// to download a PDF, and the PDF is just a PO box where you send the thing if it fails.
My new drill is easier to use, but I'll never know how it works, or how to repair it.
My daughter attends an elementary school where programming (or maybe it's better described as algorithm design, or possibly computer science?) is built into the curriculum.
They start the kids in kindergarten with CodeMonkey, Jr. (https://www.codemonkey.com/). CodeMonkey, a descendant of turtle graphics, provides a gamified interface where the user gives movement instructions to a cartoon monkey in a virtual landscape.
This is pretty far removed from writing a script to manipulate data from a file on your computer and save the results to a file, but it also lays the groundwork for a sophisticated understanding of programming concepts down the line.
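For readers who haven't seen this style, here is a toy version of the idea: a short list of movement commands interpreted on a grid. The command names are invented for this sketch and are not CodeMonkey's actual syntax:

```python
# A toy interpreter for turtle/CodeMonkey-style movement instructions.
# The command vocabulary ("forward", "left", "right") is invented here.

HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # north, east, south, west

def run(program):
    x, y, h = 0, 0, 0  # start at the origin, facing north
    for cmd in program:
        if cmd == "forward":
            dx, dy = HEADINGS[h]
            x, y = x + dx, y + dy
        elif cmd == "right":
            h = (h + 1) % 4
        elif cmd == "left":
            h = (h - 1) % 4
    return x, y

# one step north, turn right, two steps east -> ends at (2, 1)
pos = run(["forward", "right", "forward", "forward"])
print(pos)
```

The child's "program" is just a sequence of commands, but it already exercises sequencing, state, and prediction — the same skills the data-manipulation script needs.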
I sometimes wonder which is the preferable starting point: making sure that everyone understands the basics of how to do something "real" with data, vs. introducing the fundamentals of how algorithms work in a more abstract context?