"Digital" for den Dummkopfen - Part 1
December 03, 2019

Terry Critchley
Author of "Making It in IT"


The word "digital" is today thrown around in word and phrase like rice at a wedding, and never do two utterances of it have the same meaning. Common phrases like "digital skills" and "digital transformation" are explained in 101 different ways. As Humpty Dumpty said to Alice: "When I use a word, it means just what I choose it to mean — neither more nor less." This applies to "digital."

The outcome of this is a predictable cycle of confusion, especially at business-management level, where the answer to business issues is often "more technology."


Digital

A term applied to computers, which understand only 1s and 0s, to differentiate them from their analogue cousins, which understand waveforms, graphs and other visual indicators of numeric values. No arguments here, except for quantum computers, which have many more than two configuration options to play with in their atomic configurations. I suspect, though, that it is possible to become "digital" on those too.
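To make the analogue/digital distinction concrete: going digital is just sampling and quantisation, mapping a continuous value onto one of a fixed number of discrete levels. A minimal sketch in Python (the function name and the 3-bit resolution are illustrative choices, nothing standard):

```python
import math

def quantize(value, bits=3, vmin=-1.0, vmax=1.0):
    """Map a continuous sample onto one of 2**bits discrete levels."""
    levels = 2 ** bits
    # Clamp into range, scale to [0, levels - 1], round to nearest level.
    clamped = max(vmin, min(vmax, value))
    step = (vmax - vmin) / (levels - 1)
    return round((clamped - vmin) / step)

# Sample an analogue waveform (one cycle of a sine wave) at eight points
# and digitise each sample into a 3-bit level (0..7).
samples = [math.sin(2 * math.pi * t / 8) for t in range(8)]
digital = [quantize(s) for s in samples]
print(digital)
```

The digitised list is all the computer ever sees; the smooth waveform itself is the analogue cousin's territory.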

Digital Skills Shortage

A term applied to the shameful shortfall of dummkopfen who don't have digital skills and should. However, acquiring IT (digital) skills does not mean there is a standard level of competence throughout IT and that you either have it or you don't. Skill is not binary in that sense, and I have a personal mental picture of IT skill sitting at four levels for the business-level user:

1. Awareness of the basic use of IT at the level of computers, data and connections between computers; school students should have this, as should members of the public who interact with computer services. Teaching and learning coding at this level is a waste of time; a view from 10,000 feet is far better.

2. Acquaintance with these elements and the ability to enter a discussion about the use of computing. This, I feel, is the level that non-IT managers in enterprises undergoing digital transformation should possess. For too long they have been spoon-fed systems and applications, only to complain at the end that the deliverables are not fit for purpose.

This might be avoided if they were in at the start of the project process, with the ability to question computing projects sensibly at a pragmatic level. The days when such managers might say, "Oh, I leave all that stuff to my techie chaps!" are over.

3. Overall IT knowledge, analogous to the type acquired by medics at medical school but not at specialist level. It is a mandatory precursor to any IT specialisation, as I repeat ad nauseam to anyone who will listen.

4. Specialist knowledge in a particular area, but only acquired once the person has traversed level 3 above. Ideally, level 4 should be provided by the employer, since its requirements of any specialisation will vary from some perceived "standard" for that specialisation. This is only partially recognised by employers.

For levels 3 and 4, the education should include enough material to give a level 1 or 2 understanding of topics peripheral to the main themes of IT, such that the person can follow a discussion on these topics. Examples are GDPR (General Data Protection Regulation), TCO (total cost of ownership), ROI (return on investment), RCA (root cause analysis) and so on.

This sort of knowledge is essential in digital transformations, depending on the part each person plays in them. In addition, broad IT knowledge is the icing on the cake which gives a person the edge over others in the job and career progression stakes.


There is a tendency these days to isolate specialist subject training as if it stands alone in the computing environment. Many courses and training paths today are aimed at a specialist subject, such as cybersecurity or data science, without specifying any prerequisite knowledge. This is a mistake, similar to teaching someone how to sail a boat but neglecting to tell them about the sea, its hazards and its navigation.

To take an analogy, can you imagine a cardiologist reaching his/her exalted position without any other medical training, that is, general medical school? Look at the following quotation which explains this to a "T":

John Muir said: "When we try to pick out anything by itself, we find it hitched to everything else in the Universe."

For example, cybersecurity will require knowledge of networks and their protocols, of forensics and wire data, plus knowledge of the access points for that data and, further, some knowledge of data structures, and so on. This knowledge will have to be pragmatic and not purely theoretical, as it tends to be in computer science education, both in schools and in universities.
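As a sketch of what that pragmatic protocol-plus-data-structures knowledge looks like in practice, here is one routine wire-data task: unpacking the fixed fields of an IPv4 header. The field layout follows RFC 791; the sample packet and the helper's name are made up for illustration:

```python
import struct

def parse_ipv4_header(raw: bytes) -> dict:
    """Unpack the fixed 20-byte IPv4 header into named fields."""
    (ver_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw[:20])
    return {
        "version": ver_ihl >> 4,
        "header_len_bytes": (ver_ihl & 0x0F) * 4,
        "total_length": total_len,
        "ttl": ttl,
        "protocol": proto,  # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# A hand-built sample header: version 4, 20-byte header, TTL 64, TCP,
# 10.0.0.1 -> 192.168.1.10 (checksum left as zero purely for illustration).
sample = struct.pack("!BBHHHBBH4s4s",
                     0x45, 0, 40, 1, 0, 64, 6, 0,
                     bytes([10, 0, 0, 1]), bytes([192, 168, 1, 10]))
fields = parse_ipv4_header(sample)
print(fields)
```

Nothing here is exotic, but it leans on networking (the header layout), data structures (byte packing and bit masks) and forensics habits (reading raw captures) all at once, which is exactly the point.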

The specialist, too, will need to take part in team discussions where part of a project involves his/her specialism, but will find it very hard to communicate with the other members of the team, who will often be speaking a different language. (Modern computing is a team game, not a game of tennis singles.)

Other Facts about "Digital Skills"

Another factor mandating broad IT skills is that the use of these new "toys" will eventually be judged by executives on their usefulness to business in either ££/$$s or other tangible benefits and NOT the degree of "Gee Whizziness" or trendiness; those criteria are the province of the "geek" and the IT-unwashed media and politicians. In addition, there may well come a time when these tools have run their course for a business and become business as usual (BAU) which leaves the skills incumbent on a sticky wicket. I have seen it happen in my decades-long sojourn in IT in various roles and industry environments.

In any case, which specialised graduate would want to be writing AI algorithms or Python programs from the age of 20 to 65 or more, when he/she reaches retirement? They can't shift jobs easily in the interim without a solid IT background, and the alternative to such a shift is leaving. Without a doubt, computer job migration is happening in most industries, and so the "one-subject" person is highly exposed.

Also, it is sometimes said that the half-life of any job in IT is 18-24 months, which can mean anything from a complete job change to a job morphing into something slightly different. Either way, broad skills are necessary to ride this wave of change.
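Taking the quoted half-life figure at face value, the arithmetic is ordinary exponential decay: after t years, a fraction of 0.5^(t / half-life) of jobs remain in their original form. A small sketch, assuming the lower 18-month figure:

```python
def fraction_unchanged(years, half_life_years=1.5):
    """Fraction of IT jobs still in their original form after `years`,
    assuming exponential decay with the stated half-life (an assumption,
    not a measured figure)."""
    return 0.5 ** (years / half_life_years)

# With an 18-month (1.5-year) half-life, after three years only a
# quarter of jobs would remain in their original form.
print(round(fraction_unchanged(3), 2))  # 0.25
```

On that reading, a narrow skill set has a short shelf life indeed, which is the argument for breadth.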

Read Part 2 of this blog.
