"Digital" for den Dummkopfen - Part 1
December 03, 2019

Terry Critchley
Author of "Making It in IT"


The word "digital" is today thrown around in word and phrase like rice at a wedding and never do two utterances thereof have the same meaning. Common phrases like "digital skills" and "digital transformation" are explained in 101 different ways. As Humpty Dumpty said to Alice; "When I use a word, it means just what I choose it to mean — neither more nor less." This applies to "digital."

The outcome of this is a predictable cycle of confusion, especially at business-management level, where the answer to business issues is too often "more technology."

Digital

A term applied to computers that understand only 1s and 0s, to differentiate them from their analogue cousins, which work with waveforms and other continuous representations of numeric values. No arguments here, except for quantum computers, which have many more states to play with than just two in their atomic configurations. I suspect, though, that it is possible to become "digital" on those too.
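To make the distinction concrete, here is a minimal sketch in Python (my illustration; the sine wave, bit width and sample count are arbitrary) of how a continuous analogue quantity becomes 1s and 0s through sampling and quantisation:

    import math

    def quantise(value, bits=8, lo=-1.0, hi=1.0):
        # Map a continuous value in [lo, hi] onto one of 2**bits discrete
        # levels and return that level as a string of 1s and 0s.
        levels = (2 ** bits) - 1
        step = round((value - lo) / (hi - lo) * levels)
        return format(step, "0{}b".format(bits))

    # Sample an analogue sine wave at eight instants; each continuous
    # sample becomes a finite binary code a digital computer can store.
    for i in range(8):
        t = i / 8
        analogue = math.sin(2 * math.pi * t)
        print("t={:.3f}  analogue={:+.3f}  digital={}".format(
            t, analogue, quantise(analogue)))

The analogue value on the left can take infinitely many levels; the digital code on the right has exactly 256.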

Digital Skills Shortage

A term applied to shameful shortfalls of dummkopfen who don't have digital skills but should. However, acquiring IT (digital) skills does not mean there is a single standard of competence throughout IT that you either have or you don't. Skill is not binary in that sense, and I have a personal mental picture of IT skill sitting at four levels for the business-level user:

1. Awareness of the basic use of IT at the level of computers, data and the connections between computers; school students, and members of the public who interact with computer services, should have this. Teaching and learning coding at this level is a waste of time; a view from 10,000 feet is far better.

2. Acquaintance with these elements and the ability to enter a discussion about the use of computing. This, I feel, is the level that non-IT managers in enterprises undergoing digital transformation should possess. For too long they have been spoon-fed systems and applications, only to complain at the end that the deliverables are not fit for purpose.

This might be avoided if they were in at the start of the project process, able to question computing projects sensibly at a pragmatic level. The days when such managers might say, "Oh, I leave all that stuff to my techie chaps!" are over.

3. Overall IT knowledge, analogous to the type acquired by medics at medical school but not at specialist level. It is a mandatory precursor to any IT specialisation, as I repeat ad nauseam to anyone who will listen.

4. Specialist knowledge in a particular area, but only acquired once the person has traversed level 3 above. Ideally, level 4 should be provided by the employer, since the employer's requirements for any specialisation will vary from some perceived "standard" for that specialisation. This is only partially recognised by employers.

For levels 3 and 4, the education should include enough material to give a level 1 or 2 understanding of topics peripheral to the main themes of IT, such that the person can follow a discussion of them. Examples are GDPR (General Data Protection Regulation), TCO (total cost of ownership), ROI (return on investment), RCA (root cause analysis) and so on.

This sort of knowledge is essential in digital transformations, depending on the part each person plays in them. In addition, broad IT knowledge is the icing on the cake which gives a person the edge over others in the job and career progression stakes.

Specialization

There is a tendency these days to isolate specialist subject training as if it stood alone in the computing environment. Many courses and training paths today are aimed at specialist subjects, such as cybersecurity or data science, without specifying any prerequisite knowledge. This is a mistake, akin to teaching someone how to sail a boat while neglecting to tell them about the sea, its hazards and its navigation.

To take an analogy, can you imagine a cardiologist reaching his/her exalted position without general medical training, that is, medical school? The following quotation explains this to a "T":

John Muir said: "When we try to pick out anything by itself, we find it hitched to everything else in the Universe."

For example, cybersecurity requires knowledge of networks and their protocols, of forensics and wire data, of the access points for that data, plus some knowledge of data structures, and so on. This knowledge will have to be pragmatic, not purely theoretical as it tends to be in computer science education, both in schools and in universities.

The specialist, too, will need to take part in team discussions where part of a project involves his/her specialism, and will find it very hard to communicate with other members of the team if they are speaking a different language. (Modern computing is a team game, not a game of tennis singles.)

Other Facts about "Digital Skills"

Another factor mandating broad IT skills is that these new "toys" will eventually be judged by executives on their usefulness to the business, in ££/$$s or other tangible benefits, and NOT on their degree of "gee-whizziness" or trendiness; those criteria are the province of the "geek" and of the IT-unwashed media and politicians. In addition, there may well come a time when such tools have run their course for a business and become business as usual (BAU), which leaves the holder of those skills on a sticky wicket. I have seen it happen in my decades-long sojourn in IT, across various roles and industry environments.

In any case, which specialised graduate would want to be writing AI algorithms or Python programs from the age of 20 until retirement at 65 or beyond? They cannot shift jobs easily in the interim without a solid IT background, and the alternative to shifting is leaving. Without a doubt, computer job migration is happening in most industries, so the "one-subject" person is highly exposed.

Also, it is sometimes said that the half-life of any job in IT is 18-24 months, which can mean anything from a complete job change to a job morphing into something slightly different. Either way, broad skills are necessary to ride this wave of change.
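As a rough illustration of what that claim implies (the 18-24 month figure is hearsay, and the arithmetic below is merely mine), a half-life compounds quickly:

    # Back-of-envelope reading of the "18-24 month half-life" claim:
    # simple exponential decay using the upper end of the quoted range.
    half_life_years = 2.0
    for years in (2, 4, 6, 10):
        remaining = 0.5 ** (years / half_life_years)
        print("after {:>2} years: {:.1%} of the original job unchanged".format(
            years, remaining))

After a decade, barely 3% of the original job content survives unchanged, which is exactly the wave of change broad skills are there to ride.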

Read Part 2 of this blog.

Dr. Terry Critchley is an IT consultant and author who previously worked for IBM, Oracle and Sun Microsystems
