The word "digital" is thrown around today like rice at a wedding, and no two utterances of it carry the same meaning. Common phrases like "digital skills" and "digital transformation" are explained in 101 different ways. As Humpty Dumpty said to Alice: "When I use a word, it means just what I choose it to mean — neither more nor less." This applies to "digital."
The outcome is a predictable cycle of confusion, especially at business-management level, where the answer to business issues is too often "more technology."
Digital
A term applied to computers, which understand only 1s and 0s, to differentiate them from their analogue cousins, which work with continuous waveforms and other physical representations of numeric values. No arguments here, except perhaps for quantum computers, whose qubits offer many more configurations to play with than just two. I suspect, though, that it is possible to become "digital" on those too.
Digital Skills Shortage
A term applied to shameful shortfalls of Dummköpfe who don't have digital skills and should. However, acquiring IT (digital) skills does not mean there is a single standard level of competence throughout IT that you either have or you don't. Skill is not binary in that sense, and my personal mental picture puts IT skill at four levels for the business-level user:
1. Awareness of the basic use of IT at the level of computers, data and the connections between computers; school students and members of the public who interact with computer services should have this. Teaching and learning coding at this level is a waste of time; the view from 10,000 feet is far better.
2. Acquaintance with these elements and the ability to enter a discussion about the use of computing. This, I feel, is the level that non-IT managers in enterprises undergoing digital transformation should possess. For too long they have been spoon-fed systems and applications, only to complain at the end that the deliverables are not fit for purpose.
This might be avoided if they were in at the start of the project process, able to question computing projects sensibly at a pragmatic level. The days when such managers might say, "Oh, I leave all that stuff to my techie chaps!" are over.
3. Overall IT knowledge, analogous to the kind acquired by medics at medical school but not at specialist level. It is a mandatory precursor to any IT specialisation, as I repeat ad nauseam to anyone who will listen.
4. Specialist knowledge in a particular area, acquired only after the person has traversed level 3 above. Ideally, level 4 should be provided by the employer, since its requirements of any specialisation will vary from some perceived "standard" for that specialisation. This is only partially recognised by employers.
For levels 3 and 4, the education should include enough material to give a level 1 or 2 understanding of topics peripheral to the main themes of IT, such that the person can follow a discussion on them. Examples are GDPR (General Data Protection Regulation), TCO (total cost of ownership), ROI (return on investment), RCA (root cause analysis) and so on.
This sort of knowledge is essential in digital transformations, depending on the part each person plays in them. In addition, broad IT knowledge is the icing on the cake that gives a person the edge over others in the job and career-progression stakes.
There is a tendency these days to isolate specialist-subject training as if it stands alone in the computing environment. Many courses and training paths are aimed at a specialist subject, such as cybersecurity or data science, without specifying any prerequisite knowledge. This is a mistake, similar to teaching someone how to sail a boat while neglecting to tell them about the sea, its hazards and its navigation.
To take an analogy, can you imagine a cardiologist reaching his/her exalted position without any general medical training, that is, medical school? The following quotation explains this to a "T":
John Muir said: "When we try to pick out anything by itself, we find it hitched to everything else in the Universe."
For example, cybersecurity requires knowledge of networks and their protocols for forensics and wire data, plus knowledge of the access points for that data, and further some knowledge of data structures and so on. This knowledge has to be pragmatic and not purely theoretical, as it tends to be in computer science education in both schools and universities.
The specialist too will need to take part in team discussions where part of a project involves his/her specialism, but without broader knowledge they will find it very hard to communicate with other members of the team, who will often be speaking a different language. (Modern computing is a team game, not a game of tennis singles.)
Other Facts about "Digital Skills"
Another factor mandating broad IT skills is that the use of these new "toys" will eventually be judged by executives on their usefulness to the business, in £s/$s or other tangible benefits, and NOT on their degree of "gee-whizziness" or trendiness; those criteria are the province of the "geek" and the IT-unwashed media and politicians. In addition, there may well come a time when these tools have run their course for a business and become business as usual (BAU), which leaves the skill's incumbents on a sticky wicket. I have seen it happen in my decades-long sojourn in IT across various roles and industry environments.
In any case, which specialised graduate would want to be writing AI algorithms or Python programs from the age of 20 until retirement at 65 or beyond? They can't shift jobs easily in the interim without a solid IT background, and the alternative to such a shift is leaving. Without a doubt, computer job migration is happening in most industries, and so the "one-subject" person is highly exposed.
Also, it is sometimes said that the half-life of any job in IT is 18-24 months, meaning anything from a complete job change to a job morphing into something slightly different. Either way, broad skills are necessary to ride this wave of change.
Read Part 2 of this blog.