This article was originally published in ‘Trends in Technology,’ a special advertising section in the March 24, 2017, edition of the Cincinnati Business Courier.

By Kevin Kirby
Dean, College of Informatics

I often visit my biologist and physicist friends at work. As I walk down the corridors to their labs, I see hazard signs everywhere. The black and gold shield for radiation, the interlocked horns of the biohazard sign, and even the skull and crossbones label for poison. Students walk in and out of these labs. Science can be dangerous. We accept this, because science is important.

In my own building, home to the NKU College of Informatics, we have no such signs. We should.

Last month I was in a faculty-student research meeting that hinted at these dangers. We were studying the work of cybersecurity researchers at Microsoft and Penn State. They had trained a brain-like computing model on 4.5 million files of code, and their system learned to tell malicious code from safe code with 99.6 percent accuracy.

This kind of automated security clearly has great implications for business. But the implications run deeper. Our research group, like Microsoft’s, works in a field called “deep learning.” Chances are, if your business benefits from sophisticated analytics, you have come across this technology by now. In December 2016, the McKinsey Global Institute issued a lengthy report, “The Age of Analytics: Competing in a Data-Driven World.” More than 15 pages of the report were dedicated specifically to deep learning.

The idea behind deep learning is to take layer upon layer of artificial neurons, modeled very loosely on brain cells, expose them repeatedly to data, and have them learn from experience. If you borrow a few high-end video cards from your gamer friends, and trick the cards into crunching data, you can do this with blazing speed. There are amazing tools for this, such as Google’s TensorFlow.
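
For the curious, here is a minimal sketch of that idea in Python, using TensorFlow’s Keras interface. The layer sizes, the 256-number summary of each file, and the safe-versus-malicious output are illustrative assumptions on my part, not the design of any system mentioned above:

    import tensorflow as tf

    # Stack layer upon layer of artificial neurons (sizes chosen for illustration).
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(256,)),            # 256 numbers summarizing one file (an assumption)
        tf.keras.layers.Dense(128, activation="relu"),  # first layer of artificial neurons
        tf.keras.layers.Dense(64, activation="relu"),   # a deeper layer
        tf.keras.layers.Dense(1, activation="sigmoid"), # output: chance the file is malicious
    ])

    # Expose the network repeatedly to data and let it learn from experience.
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # model.fit(file_features, labels, epochs=10)       # hypothetical training data

A few lines like these define the layers; the repeated exposure to data happens in the training step, and a high-end video card is what makes that training run at blazing speed.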

The algorithms behind deep learning go back to the 1950s and blossomed for a while in the late 1980s. I taught workshops on these methods for Wright-Patterson Air Force Base in the early 1990s. But deep learning did not reach the industrial scale needed for business until the past year or two. It will be incorporated into our kit of analytics tools and soon become commonplace.

Let’s think about what this means, as this newly powerful species of tech moves into our tech ecosystem. As the Internet of Things (IoT) becomes more pervasive, we are surrounded by swarms of smart devices. Whether they are sitting on our shelves listening to us speak, or literally swarming around as micro-drones with eyes, they will soon be learning. They will be learning about the world, and they will be learning about us.

This is the thrill of technology. But it is only a short leap from the thrill to the danger and the threats. These are systems that learn through experience rather than by explicit rule-based programming. There’s a loss of control here. Combine this with the usual worries about loss of privacy and the creation of a surveillance culture, then add in the fact that everything is hackable. Suddenly the tech ecosystem looks wild and dangerous.

This has implications for talent. As our businesses seek out tech-savvy professionals, our universities work hard to provide them. It is often the thrills that attract students to tech, and as dean I often talk up tech’s wow factor. But the absolute worst thing a university can do nowadays is crank out large numbers of one-dimensional tech specialists. The world is too dangerous for that.

A more secure approach is to reinforce a strong technology education with a framework that brings in ethics, responsibility, and the social-global context of tech. It’s not just asking IT majors to take a special IT ethics course (we’ve done that at NKU for 12 years); it’s embedding real-world learning experiences that make students confront the implications of their work as technologists. It’s having them work with actual practitioners and hear diverse voices.

It is also about broadening participation. We need to embed computing into the context of a true liberal arts education. My sixth-grade teacher made our class write a 14-line sonnet. She should also have made us write 14 lines of code. There can be beauty in both. And in this dangerous world we need to break down the tech/non-tech divide.

Sometimes I feel that letting my students work with code is like letting them work with plutonium. We may be getting out of our depth. It is a hazard. But with a bolder transdisciplinary approach to tech education, we just may stay safe after all.

Kevin Kirby is Dean of the College of Informatics and the Evan and Lindsay Stein Professor of Biocomputing at Northern Kentucky University.