Many people today, especially those graduating with a computer science (CS) or other computing degree, think: "I know enough now to get a job doing 'X' and that's me set up." Not quite, and for at least two reasons:
■ Jobs in IT seldom stay the same indefinitely; like a virus, they mutate or morph into a different form. Many people put the half-life of a job at 24 months, so over an IT career one would expect to drift across the computing spectrum of jobs to keep pace with the evolution of computing.
■ Imagine moving into a shiny new specialist subject like AI; great. But wait: can you imagine writing AI algorithms from graduation age, say 20, to somewhere in your 60s? A frightening thought; a specialist frozen in time.
The way to avoid this state of affairs is lifelong learning, whereby a person keeps as up to date as possible, riding the mutation wave while still having a satisfying job. This does not entail going on ponderous courses two or three times a year but assimilating pieces of knowledge "on the fly."
Acquiring IT (or digital) skills does not mean there is a single standard level of competence to aspire to, which you either have or you don't. IT skill is not a binary entity in that sense, and I envisage at least four skill levels for IT-oriented people, from beginner to hoary old-timer:
1. Awareness of the basic use of IT at the level of computer, data and connections between computers.
2. Acquaintance with these elements and the ability to enter a discussion about the use of computing. This, I feel, is the level that non-IT managers and even executives in enterprises undergoing digital transformation should possess.
3. Overall IT knowledge, analogous to the know-how acquired by students at medical school but not at specialist level. It is a mandatory precursor to any IT specialization.
4. Specialist knowledge in a particular area, but only once the person has traversed level 3 above. Ideally, level 4 should be provided by the employer, since each organization's requirements of any specialization will vary from some perceived "standard."
How does one acquire skills at the level appropriate to oneself? Not by reading tomes at various levels; I have tried that and often understand every paragraph I read but still fail to grasp the subject. Sound familiar? It dawned on me that it was better to read a few small articles on the subject, maybe more than once, until eventually you hit that "Eureka" moment when the topic slips into place.
Keeping Up to Date
What follows is what I learned about learning, over many decades in IT, both at the coal face and later as an author and researcher.
This method also allows you to get a consensus on the importance, future and usefulness of a topic or product, thereby eliminating bias and self-praise by a topic fanatic. Not only that, this "little, often and varied" approach allows people to pick up a topic, be it hardware, software or techniques, at various levels of difficulty, since the nature of the topic is rarely fully explained in a single article. Scanning several brief sources very often puts the theme together like the pieces of a jigsaw, and the subject becomes clear as you subconsciously "fill in the understanding blanks" while you read. If it doesn't, maybe you are in the wrong field of endeavor.
Some years ago, I planned to write a book and wrote a glossary for it. The book never happened, but the glossary lived on and was kept current along with my reading; the result was an Amazon Kindle eBook, with topics in the original alphabetical order. It is structured with an overview of each topic followed by reference links to articles, books and videos at various levels of difficulty, marked accordingly.
This allows the novice to pick their way through without suffering a brain explosion, and the 'expert' to flex their IT muscles on the heavy stuff. In truth, the experts will take a peek at the easy stuff and still learn something; I'm sure you will, too. I learned a lot in compiling it. It is one way of keeping up to date and even of learning a topic from scratch.
If you think you know it all already, remember this poignant quotation from the baseball manager Earl Weaver: "It's what you learn after you know it all that counts."
A Sample of the eBook
The following extract from the eBook serves to show the structure and scope of the contents. Not every topic is as detailed, but the important ones give several references, in pursuit of "little, often and varied."
blockchain: "Blockchains are immutable digital ledger systems implemented in a distributed fashion (i.e. without a central repository) and usually without a central authority. At their most basic level, they enable a community of users to record transactions in a ledger that is public to that community, such that no transaction can be changed once published. This technology became widely known starting in 2008 when it was applied to enable the emergence of electronic currencies where digital transfers of money take place in distributed systems. It has enabled the success of e-commerce systems such as Bitcoin, Ethereum, Ripple, and Litecoin. Because of this, blockchains are often viewed as bound to Bitcoin or possibly e-currency solutions in general. However, the technology is more broadly useful and is available for a variety of applications." [NIST]
There is a more formal NIST definition at the link that follows, but I feel it is not as clear as the above paragraph from the same document.
Blockchain Technology Overview [NIST, 68 pp.]
Another informative definition comes from an HfS report:
"Blockchain is a distributed ledger used to maintain a continuously growing list of records, called blocks. Each block contains a timestamp and a link to a previous block. By definition, blockchains are inherently resistant to modification of the data. Once recorded, the data in any given block cannot be altered retroactively without the alteration of all subsequent blocks and a collusion of the network majority."
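The mechanics in the definition above, each block carrying a timestamp, its data and a link (a hash) to the previous block, so that altering one block invalidates everything after it, can be sketched in a few lines of Python. This is only an illustrative toy under my own naming, not a real blockchain: there is no distribution, consensus or proof of work.

```python
import hashlib
import json
import time

def block_hash(block):
    # Hash the block's contents, excluding its own stored hash
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(index, data, prev_hash):
    # A block records a timestamp, some data and a link to its predecessor
    block = {"index": index, "timestamp": time.time(),
             "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def build_chain(records):
    # Start from a genesis block, then link each new block to the last
    chain = [make_block(0, "genesis", "0" * 64)]
    for i, rec in enumerate(records, start=1):
        chain.append(make_block(i, rec, chain[-1]["hash"]))
    return chain

def is_valid(chain):
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False  # block contents were altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # link to the previous block is broken
    return True
```

Tampering with an early block's data makes its stored hash wrong; even recomputing that hash then breaks the `prev_hash` link held by the next block, which is exactly the "alteration of all subsequent blocks" the definition describes.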