Dumbing down tech

An old school programmer points out the way in which even programmers are being taught to be glorified power users rather than actual computer engineers:

If I may be so brash, it has been my humble experience that there are two things traditionally taught in universities as a part of a computer science curriculum which many people just never really fully comprehend: pointers and recursion.

You used to start out in college with a course in data structures, with linked lists and hash tables and whatnot, with extensive use of pointers. Those courses were often used as weedout courses: they were so hard that anyone that couldn’t handle the mental challenge of a CS degree would give up, which was a good thing, because if you thought pointers are hard, wait until you try to prove things about fixed point theory.

All the kids who did great in high school writing pong games in BASIC for their Apple II would get to college, take CompSci 101, a data structures course, and when they hit the pointers business their brains would just totally explode, and the next thing you knew, they were majoring in Political Science because law school seemed like a better idea. I’ve seen all kinds of figures for drop-out rates in CS and they’re usually between 40% and 70%. The universities tend to see this as a waste; I think it’s just a necessary culling of the people who aren’t going to be happy or successful in programming careers.
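The kind of pointer manipulation that made those weed-out courses hard is easy to illustrate. Here is a minimal sketch in Python, with object references standing in for the raw pointers of the era's C and Pascal courses; in-place list reversal is the archetypal exercise:

```python
class Node:
    """A singly linked list cell; `next` plays the role of a raw pointer."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def push(head, value):
    """Prepend a value and return the new head of the list."""
    return Node(value, head)

def reverse(head):
    """Reverse the list in place by rewiring the links, one cell at a time."""
    prev = None
    while head is not None:
        head.next, prev, head = prev, head, head.next
    return prev

def to_list(head):
    """Walk the links and collect the values, for easy inspection."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out
```

Pushing 1, 2, 3 yields the list 3 → 2 → 1; reversing it yields 1 → 2 → 3. The triple assignment in `reverse` is exactly the step students had to reason through: the right-hand side is read in full before any link is overwritten.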

The other hard course for many young CS students was the course where you learned functional programming, including recursive programming. MIT set the bar very high for these courses, creating a required course (6.001) and a textbook (Abelson & Sussman’s Structure and Interpretation of Computer Programs) which were used at dozens or even hundreds of top CS schools as the de facto introduction to computer science. (You can, and should, watch an older version of the lectures online.)

The difficulty of these courses is astonishing. In the first lecture you’ve learned pretty much all of Scheme, and you’re already being introduced to a fixed-point function that takes another function as its input. When I struggled through such a course, CSE121 at Penn, I watched as many if not most of the students just didn’t make it. The material was too hard. I wrote a long sob email to the professor saying It Just Wasn’t Fair. Somebody at Penn must have listened to me (or one of the other complainers), because that course is now taught in Java.
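The fixed-point function mentioned above is the canonical SICP example of a procedure that takes another procedure as its input. A rough transliteration of the Scheme idea into Python (the names and tolerance here are illustrative, not from the course):

```python
def fixed_point(f, guess, tolerance=1e-9):
    """Iterate f from `guess` until successive values agree to within
    `tolerance`, i.e. until we have found x with f(x) close to x."""
    while True:
        nxt = f(guess)
        if abs(nxt - guess) < tolerance:
            return nxt
        guess = nxt

def sqrt(a):
    """Square root of a as the fixed point of the average-damped map
    x -> (x + a/x) / 2, following the SICP presentation."""
    return fixed_point(lambda x: (x + a / x) / 2, 1.0)
```

`sqrt(2)` converges to 1.41421356… in a handful of iterations, and `fixed_point(math.cos, 1.0)` finds the 0.739085… you get by hitting the cosine key repeatedly on a calculator.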

I wish they hadn’t listened.

The real reason the courses are being dumbed down, of course, is so that women can pass them. But they’re not only being dumbed down, they are being prettied-up and sparkle-ponied in an attempt to make the girls feel as if they’re actually able to do something meaningful. This isn’t the case, of course, but the programs are being designed in such a way that the young women won’t figure out that they’ve been sold a very expensive course in self-esteem until after they graduate and realize they can’t actually do any real programming.

This isn’t good for anyone, not for the girls who should be majoring in something else, the girls who could handle the traditional programming curriculum, or the young men who would be better off teaching themselves to program instead of paying tens of thousands of dollars to not learn the more rigorous aspects of the discipline.

I started out as a CompSci Engineering major myself. In the first semester, I realized that I didn’t enjoy the level of detail required to succeed and immediately switched to Economics. I am very, very glad that my university didn’t make the course more to my liking, as I now know that I would not have made a good programmer, much less a great one. Nor is this a case of old school guys rhapsodizing about the good old days; the situation is materially detrimental to practically everyone concerned except the universities and the banks profiting from the student loan system.

As an employer, I’ve seen that the 100% Java schools have started churning out quite a few CS graduates who are simply not smart enough to work as programmers on anything more sophisticated than Yet Another Java Accounting Application, although they did manage to squeak through the newly-dumbed-down coursework. These students would never survive 6.001 at MIT, or CS 323 at Yale, and frankly, that is one reason why, as an employer, a CS degree from MIT or Yale carries more weight than a CS degree from Duke, which recently went All-Java, or U. Penn, which replaced Scheme and ML with Java in trying to teach the class that nearly killed me and my friends, CSE121. Not that I don’t want to hire smart kids from Duke and Penn — I do — it’s just a lot harder for me to figure out who they are.

Universities should be making the entry STEM courses harder, not easier, but as it stands, both their financial and their PR incentives run in precisely the opposite direction.


Programmer-prostitutes #icanprogramming

In the end, that’s what the result of GRLZ CAN 2 CODE and pushing more women into pseudo-programming degrees is going to be. Using their sex to sell software to real programmers. Consider the function of the “developer evangelist”:

Developer evangelists are definitely a different breed. You have to, on the one hand, have the technical chops to be able to code software, and on the other hand, have the ability to talk about it. I know a lot of people that are knee deep in their technical savvy, but when it comes to explaining it to someone who’s never used it before, they fall short. You need someone that can not only walk the walk, but talk the talk and communicate it to the community.

Developer evangelists should also be forward thinking. You need visionaries who can assess the developer community and see how you should be steering the ship. Otherwise, the developer program might not necessarily take off. Developer evangelists need to be community focused. This means elevating the developer community. It means being present and going out there and working with the developer community.

As it happens, I was a developer evangelist back in 1990, back when Apple first popularized the concept. The formal title on my card was “Transdimensional Evangelist” and my job was to visit the various hardware manufacturers and computer game developers and convince them that they should be focused on 3D-acceleration hardware, not MPEG-decompression hardware, for their next generation of video cards and games. I was initially unsuccessful, but as I had been telling them, the superior technology won out in the end. It may be almost impossible to imagine now, but at the time, the vast majority of the industry was convinced that accelerated 2D video was the future, because 3D was flat-shaded, processor-intensive, and ugly… never mind that one could do so much more with it.

Now, unlike most “evangelists”, I was actively involved on the strategic development side; as it happens, I was the individual solely responsible for a chip designed for the CAD market also having the critical features required by the game development community, namely accelerated Gouraud-shading and texture mapping. I even named the chip: the 3GA. It’s not an accident that Creative Labs didn’t hold the original trademark on “3D Blaster”. However (and this is the relevant part), I was under no illusion that being the industry’s first evangelist for the inevitable move from 2D to 3D made me an engineer, much less a chip designer.

You may recall that I’ve said one reason women are unlikely to succeed in programming per se is because they tend to have an allergy to being held responsible for their own work. This is mere anecdotal evidence, not conclusive proof, but consider what sort of “technical chops” are required for this “developer” to “walk the walk”:

I’ve had issues where my code didn’t necessarily compile on the first try, and it’s great, because, all of a sudden, you see them trying to figure it out with you, and it becomes an engaging activity, as opposed to walking through a bunch of slides.

Isn’t that great? When you can’t do your job on your own but can get someone to help you figure out how to do it? And isn’t that totally unexpected and not at all anticipated by anyone who is sufficiently familiar with the female approach to technological responsibility?

This is not to say there isn’t a place for women in technology. Nor is there anything wrong with saleswomen actually knowing what they are talking about; in fact, this is a highly desirable development. But what is wrong is the pretense that this is not the probable outcome of a computer science degree, or that the evangelist (a function that combines marketing and strategic sales) is even performing a production-related job at all.

And this part cracked me up:

Right now, some of the most interesting mobile app developers I know are people who started programming just two years ago. But they’re able to plug stuff together now in such a way to make something that’s cool.

Developers who aren’t Gamma programmers and didn’t study computer science engineering at university are always the most interesting, are they not? And they must be bang-up programmers to have picked it up so quickly!


A tale of corporate torture

Any society that believes it to be vitally important to get more employees like this in the tech industry would appear to be one that is unlikely to maintain its technological advantage over the rest of the world:

In other words, her colleagues didn’t think well of her work, she was having an inappropriate and unprofessional relationship with at least one male colleague, her presence caused the performance of another male colleague to go downhill (possibly through no fault of her own), she pissed off the founder’s wife, spent considerable time on a project of no possible use to the company’s bottom line, spent much of her time at the office in the bathroom crying, the founder has now been “put on leave”, as has one of the engineers, and the company has inadvertently become the focus of considerable media attention.

How good does a female coder have to be to make her employment worthwhile if all that is the potential cost?

Read about the grand saga of the persecution of Miss Horvath at Alpha Game. It’s like something out of the Black Legend of the Spanish Inquisition.


Epic Gamma Fail

It’s always amusing to be lectured on feminist dogma by white-knighting gamma males desperate for female approval. A brief background:

  1. I wrote an Alpha Game post on the widely reported fact that most women who obtain computer science degrees don’t end up sticking with programming very long. I attributed this to the same reason women don’t write much hard science fiction: they are disinclined to put in the hard work required because they don’t enjoy it, and it’s not a field where the usual trick of playing the “I’m just a little girl” card gets the men to do their work for them.
  2. A commenter added: “I think too many of these girls who get drafted in under the ‘MOAR GIRLS!’ banner never see real work, then bail when they encounter it. Who will be at a technical conference debating the fine points of something technical, or the fine points of a pun, and who will be taking selfies in a mirror with a sign like ‘I am doing programming!’?”
  3. Enter White Knight #1, who promptly took it to Twitter, encouraging male programmers to take pictures of themselves doing programming and posting them to Twitter with the hashtag #iamdoingprogramming.
  4. The gammas, sensing the possibility of attracting some rare female attention, promptly committed the aforementioned epic gamma fail.
  5. One Ted Mielczarek promptly declared the need for a Pink Programming Police. “We need some sort of HN-terrible-comment database that we can use as a ‘do not hire’ blacklist.” He also threatened to never hire me. Oh, well, maybe I can find a job at Tor….

Apparently people with decades of experience in software development are simply supposed to ignore everything they have seen and heard while embracing the GIRLZ CAN 2 CODE movement. I have a better idea. Let’s take a scientific approach and simply mandate female employment on all mission-critical financial programming projects from now until the financial system collapses.

We know mass default and credit contraction is necessary to clear the system, so we may as well kill two birds with one stone.


Facebook fraud

How shocking that a thief like Zuckerberg would have built a massively fraudulent empire that rests on third worlders impersonating first worlders utilizing real products. If 1999 was tragedy, 2014 is pure financial farce.

The entire “social media” edifice is a giant variant on a Ponzi scheme. It’s Madoff economics.


Spying on Angry Birds

I don’t know about you, but I feel much safer to know that the NSA is keeping close tabs on middle-aged housewives playing Candy Crush:

The National Security Agency and its UK counterpart GCHQ have been developing capabilities to take advantage of “leaky” smartphone apps, such as the wildly popular Angry Birds game, that transmit users’ private information across the internet, according to top secret documents.

The data pouring onto communication networks from the new generation of iPhone and Android apps ranges from phone model and screen size to personal details such as age, gender and location. Some apps, the documents state, can share users’ most sensitive information such as sexual orientation – and one app recorded in the material even sends specific sexual preferences such as whether or not the user may be a swinger.

Many smartphone owners will be unaware of the full extent to which this information is being shared across the internet, and even the most sophisticated would be unlikely to realise that all of it is available for the spy agencies to collect.

Dozens of classified documents, provided to the Guardian by whistleblower Edward Snowden and reported in partnership with the New York Times and ProPublica, detail the NSA and GCHQ efforts to piggyback on this commercial data collection for their own purposes.

Scooping up information the apps are sending about their users allows the agencies to collect large quantities of mobile phone data from their existing mass surveillance tools – such as cable taps, or from international mobile networks – rather than solely from hacking into individual mobile handsets.

What a freaking joke. Every single employee in the NSA ought to be marched off to jail immediately. Pretty soon we’re going to learn that they’re spying on smart refrigerators too. How they rationalize this as Constitutional is beyond me. This is well beyond mere police state and deeply into farce.


30 years of Macintosh

Stephen Fry commemorates the 30th anniversary of the introduction of the Macintosh by lamenting one of the great mischances of history:

In one of the world’s most extraordinary missed meetings in industrial, commercial or any other kind of human history, a Henry Morton Stanley failed to encounter a Dr Livingstone in the most dramatic and comical fashion.

In the early 90s a young British computer scientist, Tim Berners-Lee, had been tasked by CERN (Centre Européen pour la Recherche Nucléaire, the now famous large hadron collider that found the Higgs Boson or a tiny thing pretending to be it) to go in and see if he could find a way of getting the Tower of Babel of different computing platforms used by the hundreds of physicists at the plant to talk to each other. He came up with something that made use of metatextual techniques that he called The Information Mine. Being a very very modest man he realised that those initials spelled out his name, TIM, so he changed it at the last minute to the World Wide Web. He wrote a language, HTML (Hypertext Markup Language), a set of communication protocols (chiefly http — the hypertext transfer protocol) and an application, as we would now say, on which all these could run, which he called a browser.

He planned, devised, programmed and completed this most revolutionary code in Geneva on one of Steve Jobs’s black cube NeXT computers. Hugging it close to him he took the train to Paris where Jobs was going to be present at a NeXT developers’ conference. Clutching the optical disc that contained the most important computer code in history he sat at a desk while Steve marched up and down looking at hopeful programs and applications. As in all of Steve’s judgments they either sucked or were insanely great. Like a Duchess inspecting a flower show he continued along the rows sniffing and frowning until he got two away from the man who had created the code which would change everything, everything in our world. “Sorry Steve, we need to be out of here if we’re going to catch that plane,” whispered an aide into Jobs’s ear. So, with an encouraging wave, Steve left, two footsteps away from being the first man outside CERN to see the World Wide Web. The two men never met and now, since Steve’s death, never can.

Those who only know me as an inveterate Apple-hater probably don’t realize that I started out as an Apple guy. While my father built his fortune on the IBM PC, first on its need for memory cards, then on its need for high-resolution graphics (he created and sold the first 1024×768 board for it, the ARTIST card), my pride and joy and constant companion was an Apple //e. It was stacked, with two disk drives, a color monitor, and a 300 baud modem. I loved that machine, but I gave it up reluctantly when I went off to college and it became apparent that I was going to need something better suited to writing papers.

So, my parents gave me a Macintosh Plus, which gave me a huge advantage over other students, who had to wait their turn in the computer labs when they needed to write their papers. I had a particularly nice setup, since I lived in the only dorm with its own computer lab, complete with Macintosh computers and printers, so I could write my papers, then walk the disk down to the computer lab at 4 AM and print them out without delay. I remember, in particular, one paper on Alfred the Great that blew my professor away because it included a map of England on which I’d drawn the various extents of the Danelaw.

Not that he was unfamiliar with the Danelaw, but it was the first time he’d ever seen a printed graphic in a student paper. That was the power of the Macintosh. I don’t think I ever turned in a paper again without some visual example. In fact, looking at the two college papers I still have with me today, one on the economic development of Japan and the Soviet Union, the other on the Italian condottieri, I can see crosshatched maps of Italy and several charts very similar to those that regularly litter my economics posts. That Macintosh Plus created a habit of readily resorting to bar charts that apparently persists even today.

Where the Macintosh ultimately fell down was not in its failure to penetrate the business market. That’s the conventional wisdom, but it is wrong. Apple was never going to dislodge IBM and Microsoft there and was wise not to kill itself trying. The opportunity that Steve Jobs mysteriously missed, long before the World Wide Web, was the games market. Despite the Macintosh’s GUI, Apple’s failure to adopt color for three years after the PS/2 introduced VGA/MCGA, as well as its reluctance to embrace a non-serious market, meant that it conceded the games market to DOS.

Papers be damned. The first time I saw Wing Commander, I switched immediately over to DOS and picked up a Compaq 386/25. I haven’t looked back since.

I admire the late Steve Jobs. He was an amazingly innovative corporate genius. It is deeply lamentable that his chief legacy as a technologist appears likely to be the walled garden of Apple.


A plague of fake reviews

It’s not just Amazon and the political trolls:

Sites such as Amazon, TripAdvisor and Yelp are now the go-to destinations for customers who want to cut through advertising waffle and discover what products and services are really like. Yet, according to research that is sure to panic business owners across the world, a fifth of Americans have left online reviews for items they’ve never bought or even used. This figure is even higher (32 per cent) among parents with children under 18, and the most popular reason why online shoppers questioned did this was simply because ‘they felt like it.’

I think “panic” is too strong a word. But I do think the online retailers need to do a much better job than they currently do of eliminating fake reviews and providing meaningful rating systems.

The inaccurate headline is misleading, though. A fifth of all online reviews are not fake; rather, a fifth of Americans have, at some point, left a fake review.


Speaking Engagements

I occasionally receive requests for appearances from various organizations and conferences. While I appreciate the interest, I only accept speaking engagements in continental Europe and the United Kingdom. I do not accept speaking engagements in Asia or the Americas.

I should probably mention that I do not do book signings anywhere.


NSA backdoors the tech world

The NSA appears to have been bent on destroying the ability of American companies to export hardware as well as sell software services internationally:

They claim the performance of the company’s special computers is “unmatched” and their firewalls are the “best-in-class.” Despite these assurances, though, there is one attacker none of these products can fend off — the United States’ National Security Agency.

Specialists at the intelligence organization succeeded years ago in penetrating the company’s digital firewalls. A document viewed by SPIEGEL resembling a product catalog reveals that an NSA division called ANT has burrowed its way into nearly all the security architecture made by the major players in the industry — including American global market leader Cisco and its Chinese competitor Huawei, but also producers of mass-market goods, such as US computer-maker Dell.

These NSA agents, who specialize in secret back doors, are able to keep an eye on all levels of our digital lives — from computing centers to individual computers, and from laptops to mobile phones. For nearly every lock, ANT seems to have a key in its toolbox. And no matter what walls companies erect, the NSA’s specialists seem already to have gotten past them….

The ANT division doesn’t just manufacture surveillance hardware. It also develops software for special tasks. The ANT developers have a clear preference for planting their malicious code in so-called BIOS, software located on a computer’s motherboard that is the first thing to load when a computer is turned on. This has a number of valuable advantages: an infected PC or server appears to be functioning normally, so the infection remains invisible to virus protection and other security programs. And even if the hard drive of an infected computer has been completely erased and a new operating system is installed, the ANT malware can continue to function and ensures that new spyware can once again be loaded onto what is presumed to be a clean computer. The ANT developers call this “Persistence” and believe this approach has provided them with the possibility of permanent access.

Another program attacks the firmware in hard drives manufactured by Western Digital, Seagate, Maxtor and Samsung, all of which, with the exception of the latter, are American companies. Here, too, it appears the US intelligence agency is compromising the technology and products of American companies.

At this point, why would any European or Asian company buy products from an American company? The NSA is totally out of control and it has to be dismantled as soon as possible before the American tech sector is devastated.

And what are they doing with all this access? Stopping terrorists? Interdicting drug smugglers? Preventing bank fraud? No, they’re listening to American soldiers have phone sex with their wives back home.

The NSA has zero credibility at this point. Zero.