Wait, security courses aren’t a requirement to graduate with a computer science degree?

Comment There’s a line in the latest plea from CISA – the US government’s cybersecurity agency – to software developers to do a better job of writing secure code that may make you spit out your coffee.

Jack Cable, a CISA senior technical advisor, writes that in 2019 when he was a computer science student at Stanford University in California, he didn’t need to take any cybersecurity courses to graduate. This, he says, was true for students at 23 of the top 24 computer science schools in America.

Nearly five years later, “that list of the top 24 universities in computer science hasn’t changed: 23 still don’t require cybersecurity,” Cable wrote in his memo.

Cue the coffee spitting.

The University of California, San Diego, for the record, is the only school in the top 24 whose computer science and engineering program lists security as an undergraduate degree requirement, although it's not entirely clear from the college's published curriculum that this is the case.

“Cybersecurity is viewed as a subdiscipline, much like graphics or human-computer interaction – not essential knowledge that every future software developer should be equipped with as they enter the workforce,” Cable laments. “This is unacceptable. All too often, attacks exploit simple weaknesses that any developer with basic security knowledge could have stopped.”

We wholeheartedly agree. Sure, computer science is not engineering, and you may argue that engineering is a more natural home for practical secure coding. Turning an abstract algorithm into a safe software routine, or writing a service that doesn’t blindly trust user input, for example, is an implementation-level issue for engineers. We get it.
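To make the "blindly trusting user input" point concrete, here's the textbook example of such a weakness: SQL injection. This sketch is ours, not from CISA's memo; it uses Python's built-in sqlite3 module with a throwaway in-memory database, and the table and payload are invented for illustration.

```python
import sqlite3

# Throwaway in-memory database with a single user record.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

def lookup_unsafe(name: str):
    # Vulnerable: user input is pasted straight into the SQL string,
    # so the input can rewrite the query itself.
    query = f"SELECT secret FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def lookup_safe(name: str):
    # Fixed: a parameterized query treats the input as data, not SQL.
    return conn.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "nobody' OR '1'='1"
print(lookup_unsafe(payload))  # leaks every secret in the table
print(lookup_safe(payload))    # returns nothing: no user has that name
```

The fix is a one-line change that any developer with basic security training would make reflexively, which is exactly Cable's point.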

But screw it, this situation is unsustainable. Put security in your compsci curriculum for the sake of new developers and the people using their code.

By now the infosec skills shortage is old news, and voices in both the private and public sectors have called on developers to get a grip on vulnerabilities in their software supply chains. Even the White House's National Cybersecurity Strategy calls for holding application makers liable for security flaws in their products, which will, among other things, require better training for programmers.

But if colleges and universities aren’t requiring computer science students to take any infosec classes before they are hired by these companies, look, we’ve got a real problem. It’s one that will contribute to the disconnect between security executives and developers — not to mention the ever-growing threat from ransomware and other destructive cyberattacks.

One of the reasons for this lack of coursework, according to CISA, is that the private sector isn't demanding these skills in its developer hires. In September, the agency hosted a workshop centered on the challenges of incorporating security into computer science curricula, and one of the hurdles identified was a lack of demand.

“To date, companies have not expressed that security is one of the key factors they evaluate when hiring software developers,” Cable wrote. “Until that changes, universities have little incentive to change their practices.”

But: Here’s a chance to do something about this. Last month, CISA put out a Request for Information on the role of security in computer science education. Responses are due February 20 and we’ll keep a close eye on what emerges. ®