I believe that in the UK and US there is a lack, nay an absence, of pragmatic computing education that matches the needs of the current business world of information technology (IT). Computing education today, at both school and university level, appears to me to be computer-science based and very theoretical, and it does not follow the logical sequence of activity in the development, use and management of business applications that I observed in my long IT career spanning many industries.
(In this blog I use "business" in its broadest sense to mean the "world of work," be it commercial, scientific, medical or industrial.)
In fact, the curricula appear to me to be a collection of topics with little synergy and none of the end-to-end flow that IT projects have. As an analogy, consider the following scenario, which I believe parallels the situation.
A technical course on the motor car is run at Knowalot College, covering the internals of the car: the Carnot cycle, adiabatic expansion, electronic ignition and so on; very detailed and demanding. At the end of the course, the student will probably have no concept of the motor car as a vehicle, and might not know how to drive, read a map or plan a journey from A to B. It is almost certain that he/she will not know how to decide which car, van or lorry to recommend for the business he/she works for. In short, he/she is doomed to be a head-under-the-bonnet techie forever. That job is of course necessary, but it cannot be classed as covering "motor transport"; it is simply a technical corner of it.
Not only that, but the words "business" and "requirements" do not appear anywhere in the CS curricula I have searched. Only under the title "problem solving" could one guess at a reference to business. This is not to say CS education per se is bad; it just isn't a comfortable fit with the current computing world, although it is gradually finding a niche in various areas of computing. These areas include big data, data science, cognitive and similar computing, and cybersecurity.
However, a broader knowledge across key IT concepts and architectures is needed since no person in IT is an island and anyone totally specialized will find it difficult to cross-communicate where his/her field overlaps with another, particularly in meetings or presenting to the business.
What Are the Differences?
In this part of the blog, I will try to demonstrate this CS vs. IT dichotomy, but first an outside view of the differences between CS and IT:
The proposition I put to CS people as to what modern IT is goes roughly as follows:
■ IT needs to be presented as a sequence of related activities within a framework, not a simple collection of topics.
The flow of IT projects can be represented as:
- Business idea/need
- Specification of business flow
- IT Architecture (product-free)
- Populate the design with technology
- Code or buy software
- Retire systems and start again
(There will of course be reviews and the like throughout this sequence of activity.)
You can see "coding" in context here; students and teachers cannot see this far.
■ There should be a pragmatic, contextual "wrapping" around major topics, for example, "this is used in the oil industry to map the subsea strata in the search for oil deposits." – the "so what?" test.
■ Emphasize important aspects of IT as a framework in which to teach topics. Over the years I have decided that FUMPAS represents the key elements (others can be found within these). These are the criteria to map onto any business IT project, to whatever degree of detail (reflecting its importance) the business decides.
■ Two large topics totally absent from CS curricula are mainframes, with their operating systems, and high performance computing (HPC). Much of the world's financial work is done on mainframes, and their influence is growing, believe it or not. HPC is now a big field and is expanding beyond pure science into medicine, financial modelling, AI and other power-hungry areas. Not even to mention them is a dereliction of IT teaching duty, whatever the syllabus mandates. This sort of add-on could be handled by selecting a suitable reading list, even if it is not in the syllabus.
The CS school and university syllabuses I have studied do not fit the "real world" IT scene in breadth, depth or velocity of change, and I therefore generated a keyword list to demonstrate this dichotomy. The list then developed into a learning Glossary, now on Amazon Kindle (check tomorrow for Part 2 of this blog), to show where IT fits in the business world and the topics which make it tick. The CS world can then see if their output matches these requirements.
So what? The world has gone mad on the "digital revolution" impacting nearly all business. I believe this issue needs to be addressed vigorously and quickly to tackle the much discussed "IT skills shortage." The current computer education, at least in the UK, will not achieve this aim, still less cater for the skills needs post-Brexit. I see no difference between UK CS and US CS; ergo, much of what I say also applies to the US.
Finally, I cannot find a syllabus anywhere I have looked that remotely covers IT as demonstrated by the list and subsequently the Glossary. I see this as a start in resolving the "IT Skills issue," a mantra that has been trotted out since the year 2000, if not earlier.
As Mark Twain said: "Everybody is talking about the weather, nobody is doing anything about it." I hope the Glossary is a beginning.