2021 here we come

For this planet we call Earth, 2020 has been somewhat of an “odd” year. Not just because in late 2019 a new coronavirus emerged with extremely rapid spreading capabilities and lethal tendencies, not just because everyone you know knows someone who lost someone, and not just because the political turmoil around the world (especially the “Western World”) takes us into 2021 with a level of apprehension many of us cannot comprehend. 2020 sucks, that’s a fact, for 90-something percent of us. However, even at the end of the darkest tunnels there is light, even if it’s just the flicker of a candle, for all of us.

Everyone knows the negatives of 2020; I won’t list them, there are many. However, we as a human race tend to focus on the negatives too much, so I will list some positives to think about as we head towards 2021. Pandora’s box has opened; now it’s time to find the glimmer of hope.

2020 saw achievements by many; some were driven by need and necessity, others were years in the making. And whilst the year as a whole may have taken the shine off them, we should still celebrate them.

  • We sent people into space, on a reusable rocket and capsule. – https://www.youtube.com/watch?v=E_FIaPBOJgc
  • Scientists worked at an incredible pace on vaccines for Covid-19, resulting (at this point in time) in at least 3 or 4 viable vaccines that could be available to the public before the year ends. – https://www.wired.co.uk/article/coronavirus-vaccines
  • Innovation and the ability to adapt have meant that whilst some companies folded, and many individuals have sadly lost their jobs or been furloughed, some companies were smart enough to rise to the challenge. Digital Transformation strategies and utilisation of technology have helped lots of companies overcome the disruption, in some cases increase their output, or in other cases adapt their products and services to meet new demands. In the UK, for example, we had many manufacturing companies come together to use their dormant machinery to make ventilators, PPE, girders for temporary hospitals, etc.
  • We are still here. The planet hasn’t been totally destroyed, and whilst our mental health teeters on the edge, many of us have found a new appreciation for those around us whilst learning new techniques and behaviours for getting through the day, week, month, and year. It is these newfound skills and attitudes to life that will stand us in good stead for the years ahead, whether the next event is closer to home, or on a global scale.

Ok, so, not going to lie, I struggled to find enough positive stories from this year, but those above are the ones I believe in, the ones that have piqued my interest and given me some positive vibes to get through.

On a personal/career level, 2020 was supposed to be a year of self-discovery and development. Starting a new role in October 2019 with a brand new remit and direction was incredibly scary, but a challenge I relished. Fortunately, that role survived the turmoil around Covid-19 and I’m still here to tell many tales. I have been able to upskill, adding to my “jack of all trades” mantra that now sees me capable of some very interesting and potentially dangerous things, as well as embed myself, relatively seamlessly, into a well-established and very experienced team. 2021 will see me continue in this role, with the same people, adding various activities along the way. I am glad of this consistency in my work life as the world around us continues to be so unpredictable.

The Future is Bright…

When news breaks of a large tech company sale, in the midst of a global pandemic, people will sit up and take note. It’s fair to say this one is huge news, and the future is indeed very bright.

https://www.epicor.com/en-uk/press-room/news-releases/clayton-dubilier-rice-to-acquire-epicor-software-corporation-from-kkr/

On a personal note, this will be the 4th acquisition/takeover/purchase of a company I work for in 5 years; see previous posts for notes on those!

Q1 FY20 Part 2

Welcome to Part 2 of the end-of-year summary on the career side of things. 2019 ended on a high at work, and the previous post (here) started looking at the new job role and my initial project of migrating systems between domains. This project was effectively a merge of all my previous skills and a way to develop the new skillset required to take the role forwards into 2020 and beyond.

It was mentioned in part one that the infrastructure project effectively became a DevOps implementation project, and I’ll try to delve into the bits I can discuss in this post. Firstly there’s the inheritance: over 300 environments, many with specific mods for specific customers, and the rest with what we call “Extended Solutions” – effectively productized mods. Then there was all the code itself. Fortunately, the existing dev team has a fantastic grasp of the deep dark secrets of Git, and I have enough of a basic understanding to pick up where others left off. But then something new to me came up: the builds of the code. Debug or Release, MSBuild versions, semantic versioning, Git Flow… you get the idea. All very complicated to me at the time, but now it’s in my veins!

Previously the team used an old, unsupported, broken version of Jenkins to build their code, with definitions for each Git branch (some in bat files, some hardcoded in the Jenkins config screen), basically a mess, with different rules for different people. Well, I like to standardize, so we scrapped the old and brought in the new. Cue the amazing concept that is Continuous Integration. Having some basic experience with this in Azure DevOps (remember this post?), the concept was not new; implementing it with Jenkins, however, was a new experience.

I inherited a blank new Jenkins server, and the first thing I did was reverse proxy it via IIS and secure it with an internal SSL certificate, as well as connect it up to the corporate Active Directory and restrict access to our team only. Then we got a service account from Corp IT and locked the server down so that only myself and Domain Admins can get into the back end, and now we have a secured build server. Why so secure when it’s all internal? Because the CTO office allowed us to have the corporate digital certificate for code signing only so long as it exists in one locked-down place. We can call it via the Jenkins application side but not extract, manipulate or otherwise interact with it.

With this new updated (and updateable) Jenkins server, plus a couple of useful plugins (Blue Ocean is a must), we have a fantastic platform to manage and analyze our build process. The main feature we are utilizing is Multibranch Pipelines via a Jenkinsfile. The Jenkinsfile is written in Groovy and is basically a set of instructions that define a build; for example, we can say build this solution, sign it using that certificate, and publish the artifacts so we can download them afterwards. The huge advantage for us is that because we build a solution for multiple ERP versions, we can have up to 8 exes output at the end, and we now have one screen to grab them from, regardless of which Git branch we built from. On the subject of Git branches, thanks to the multibranch pipeline functionality, once our Jenkinsfile is pulled into Master it then filters down to all subsequent branches, and Jenkins will detect any new branches pushed back to the origin that include that Jenkinsfile.

I’ve previously written about VS Code and all the wondrous things it does, but we also discovered a Jenkinsfile checker in the form of https://marketplace.visualstudio.com/items?itemName=janjoerke.jenkins-pipeline-linter-connector. This tool allows us to check syntax against our own Jenkins server and therefore ensures an accurate build definition every time we adjust a build. It’s time-savers like these that have boosted the team’s productivity significantly; I recently tweeted about this improvement, as I took hold of an existing codebase and fully integrated it into our new philosophy within an hour or so!
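To give a flavour of what one of these Jenkinsfiles looks like, here is a minimal declarative-pipeline sketch of the sort of thing described above. To be clear, the stage layout, solution name, ERP version list and signing command are illustrative placeholders, not our actual build definition.

```groovy
// A minimal sketch only - stage names, paths, versions and the signing command
// are placeholders for illustration, not the team's real build definition.
pipeline {
    agent any

    environment {
        CONFIGURATION = 'Release'   // or 'Debug' for developer builds
    }

    stages {
        stage('Build per ERP version') {
            steps {
                script {
                    // One MSBuild run per supported ERP version, all from the same branch
                    def erpVersions = ['10.2.300', '10.2.400', '10.2.500']   // example versions
                    for (v in erpVersions) {
                        bat "msbuild ExtendedSolution.sln /p:Configuration=%CONFIGURATION% /p:ErpVersion=${v}"
                    }
                }
            }
        }

        stage('Sign') {
            steps {
                // The code-signing certificate only ever lives on this locked-down server
                bat 'signtool sign /a /fd SHA256 output\\*.exe'
            }
        }

        stage('Publish') {
            steps {
                // One screen to grab every versioned exe from, whatever the branch
                archiveArtifacts artifacts: 'output/**/*.exe', fingerprint: true
            }
        }
    }
}
```

As I understand it, the VS Code linter extension mentioned above simply sends a file like this to the Jenkins server’s own pipeline validator, so syntax slips get caught before the branch is even pushed.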

A large part of the battle has been documenting the configuration and the overall process, as well as educating colleagues (primarily developers) about how we are using these concepts and tools. I’ve found that the majority of developers know the concepts and have their own experiences with Git and CI/CD, but until you document exactly what the process should be in their circumstances, they don’t necessarily see the advantages or understand just how powerful these changes are, and more importantly how they improve consistency and productivity across the team. The improvements are already showing for us, and I fully expect that to continue!


Q1 FY20 Part 1

Back on October 1st 2019 I decided to take a leap of faith and join the dark side, and this has resulted in me becoming the UK’s first dedicated QA for our Custom development teams. Effectively we develop, using our own SDK and lots of other clever tools, the things that customers would love to have but which do not come out of the box. Having visited many customer sites in the last couple of years, I have nothing but appreciation for the quality and depth of work this team produces, and it’s absolutely my pleasure to be a part of it going forwards.

My first task, set on Day 1, was to own the migration of systems from one domain to the other. Those who have followed any of my previous posts over the last 4 years will be aware that I went from a small local company to a global ERP vendor overnight (June 1st 2016) by way of an acquisition. Well, imagine moving that small dev team’s environments into a very well protected and governed American corporate ecosystem: it was effectively sat on for 3 years, and corporate policies dictated we migrate and shut down the old!

Deciding where to start was easy… spend a week or so testing out a couple of theories, having done domain migrations previously, and work with internal IT teams to put in the relevant requests and procedures to ensure those theories were robust, scalable and secure. Three weeks in, hours had been wasted scripting out a copy-and-paste scenario: basically a load of PowerShell scripts to do a Find/Replace-style blitz across 1000s of files, 10 different ERP versions and 200+ development environments (with databases). There was just the one slight snag: even after reworking permissions and roping IT into a 3TB file copy across 2 unconnected domains, our internally developed environment management tooling, with all its bells and whistles, did not support the new domain and had hardcoded ties to the older domain’s file server. Oops.
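For context, the “Find/Replace-style blitz” was nothing cleverer than walking the copied file tree and swapping old-domain references for new ones in config files. The real scripts were PowerShell; purely as an illustration of the idea, a Groovy sketch with made-up paths and domain names would look something like this:

```groovy
// Illustration only - the real scripts were PowerShell, and the root path,
// domain names and file extensions below are invented placeholders.
import groovy.io.FileType

def root = new File('D:/MigratedEnvironments')   // hypothetical copy of the old file share
def oldDomain = 'old.internal.example'
def newDomain = 'new.corp.example'

root.eachFileRecurse(FileType.FILES) { f ->
    // Only touch the config-style files that reference the old domain
    if (f.name.endsWith('.config') || f.name.endsWith('.sysconfig')) {
        def text = f.text
        if (text.contains(oldDomain)) {
            f.text = text.replace(oldDomain, newDomain)
            println "Updated ${f.path}"
        }
    }
}
```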

Rethink time… Plan B – the best of the lot. Copying databases is one of those things I literally wrote the manual on for Epicor ERP, so that’s easy; building Windows servers has been the last 10 years of my life, so again, sorted; that leaves my understanding of the tooling that sits in the middle. Well, fortunately, my new desk backs on to the lovely chap who wrote that tool, even though he now runs our R&D division, so with a few conversations and about 8 lines of code he rebuilt it for me to work on the new environments, allowing me to fully document it as it got deployed. Hey presto, a working blank set of servers ready for migrated data was born within a week, including the ability to build any version of ERP 10 using blank, demo or customer data (depending on whether it’s development or QA work), and the ability to use all the latest features and, more importantly, the latest development tools, by way of Chocolatey!
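On the Chocolatey point, the win is unattended installs of the latest development tools as part of an environment build. Purely as a sketch of the idea (the package list here is an example, not our actual tooling set):

```groovy
// Sketch only - example packages, not the real tooling list for our environments.
def devTools = ['git', 'vscode', 'notepadplusplus']

devTools.each { pkg ->
    // 'choco install -y' performs an unattended install of the latest package version
    def proc = "choco install ${pkg} -y".execute()
    proc.waitFor()
    println "choco install ${pkg} exited with code ${proc.exitValue()}"
}
```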

The next few weeks consisted of identifying what needed to be moved and what we could spin up later on demand. The resulting list was around 120 required environments, mostly because of productised “Extended Solutions”, which need to be built for each version of ERP 10 we support, but also ongoing customer projects, version uplifts, test environments for developers to test their own theories and boost their skills, etc. This was a very slow and involved process; per environment/DB it was not too bad, but in Part 2 (when I write it) I’ll go through how my domain migration project became an environment and process improvement project, featuring Git, Jenkins, CI/CD and more.

The good news is that my domain migration, which was scheduled to be fully complete (i.e. old domain shut down) by 24th December 2019, was in fact completed on 6th December 2019. So, despite the slightly wasted 3 weeks of testing, scripting and familiarisation, with all parties on board we (sadly) shut down the Dot Net IT domain at 17:30 that evening!

This and That

Over the last few months I’ve tried to expand my horizons a little bit. Since 2009 I have worked in a few different technical roles, from helping to run data centres and set up environments for ISV engagements at IBM, to running all systems for a rapidly growing Oracle partner whilst managing 100 websites, including e-commerce sites, on the side. That led into my quick stint doing tech support in the automotive sector before moving into customer-facing roles in Jan 2016. Since then I’ve always been running on a few different threads; these have been, loosely:

  • Installs/Config for ERP systems including initial system design
  • Technical training of customers in those ERP systems
  • Technical management of escalated issues (across the world)
  • Cross-team liaison for high-profile or highly escalated customers
  • Coordination of international team of installations consultants
  • Development of internal tooling for installs/technical consulting
  • Management of environments for wider team

From my recent posts it’s obvious which areas on that list have received the most focus over the last few months, notably the last two, which is where all the DevOps/code posts are centred. The reason so much focus has been on this (and I’ll add, a lot of it out of work hours) is that it’s something I enjoy, something I’ve been on the edge of before, and an area of technology that I personally believe we should all be at least aware of, and able to understand the basic principles of.

DevOps was a term coined many years before it became mainstream. Mike Loukides wrote a 20-page book called “What is DevOps” back in June 2012, published by the world-renowned O’Reilly Media (http://shop.oreilly.com/product/0636920026822.do). That’s some time before I came across the term, although it seems I was already aware of some of the practices that now come under that umbrella. Back then I was managing e-commerce sites, writing PHP websites against MySQL databases and moving a very static, cumbersome “tin-factory” infrastructure over to a more dynamic, sustainable, growth-capable platform. With a little more time and knowledge back then, I would potentially have moved in different directions. I am now starting to close that circle a little from the other side.

For me, career development is crucial. I am more than happy to stay with one company, or in one role, but I will always push to make more of myself, learn new things, get involved with everything possible and break down any and all barriers. I don’t do this to benefit myself; I see it as an opportunity for me to be a benefit to those around me, both customers and colleagues.

Outside of DevOps activities, over recent months I’ve also been working on my presentation skills, with opportunities to present to colleagues and customers about various technical topics, including System Administration, upcoming product changes, best practices, etc. This is in part due to being given more free rein in my current role while we work out what my future roles may or may not include, and that’s if there’s any change at all! In the background, the day-to-day role keeps me busy: planning installs, speaking to new customers about how to deploy, speaking to existing customers about upgrades or enhancements to their systems; all the fun stuff that keeps money in the bank and roofs over heads!

The next few months may get a little busy, well hopefully they will, and all the good stuff will be posted when the chances arise.