2021-2022 – What happened?

The last time I blogged on here was to see out the Covid-ridden 2020 and see in 2021; I then forgot to blog for two years. Sorry about that.

2021 was more of the same: a focus on work and home life balance, getting used to permanent work from home, and an overall pretty quiet year. 2022 then saw some changes, some adaptations to be made on a personal and professional level, some losses, and some gains. There was emotional turmoil, and a big bang to end the year when everyone finally got together to share stories and ideas and generate optimism for the future. Some 2021/2022 items made it to my personal blog at https://johnnyward.me.uk for those interested!

So what comes next…

Well, 2023 will be a big year, watch this space.

2021 here we come

For this planet we call Earth, 2020 has been somewhat of an “odd” year. Not just because in November 2019 a new coronavirus was discovered with extremely rapid spreading capabilities and lethal tendencies, not just because everyone you know knows someone who lost someone, and not just because of the political turmoil around the world (especially the “Western World”) that takes us into 2021 with a level of apprehension many of us cannot comprehend. 2020 sucks, that’s a fact, for 90-something percent of us. However, even at the end of the darkest tunnels there is light, even if it’s just the flicker of a candle, for all of us.

Everyone knows the negatives of 2020; I won’t list them, as there are many. However, we as a human race tend to focus on the negatives too much, so I will list some positives to think about as we head towards 2021. Pandora’s box has opened, and now it’s time to find the glimmer of hope.

2020 saw achievements by many; some were driven by need and necessity, others were years in the making. And, whilst the year as a whole may have taken the shine off them, we should celebrate them.

  • We sent people into space, on a reusable rocket and capsule. – https://www.youtube.com/watch?v=E_FIaPBOJgc
  • Scientists worked at an incredible pace on vaccines for Covid-19, resulting (at this point in time) in at least 3 or 4 viable vaccines that could be available to the public before the year ends. – https://www.wired.co.uk/article/coronavirus-vaccines
  • Innovation and the ability to adapt have meant that whilst some companies folded, and many individuals sadly lost their jobs or were furloughed, some companies were smart enough to rise to the challenge. Digital Transformation strategies and the utilisation of technology have helped lots of companies overcome the challenge, in some cases increasing their output, in others adapting their products and services to meet new demands. In the UK, for example, many manufacturing companies came together to use their dormant machinery to make ventilators, PPE, girders for temporary hospitals, etc.
  • We are still here. The planet hasn’t been totally destroyed, and whilst our mental health teeters on the edge, many of us have found a new appreciation for those around us whilst learning new techniques and behaviours for getting through the day, week, month, and year. It is these newfound skills and attitudes to life that will stand us in good stead for the years ahead, whether the next event is closer to home or on a global scale.

Ok, so I’m not going to lie, I struggled to find enough positive stories from this year, but those above are the ones I believe in, the ones that have piqued my interest and allowed me to have some positive vibes to get through.

On a personal/career level, 2020 was supposed to be a year of self-discovery and development. Starting a new role in October 2019 with a brand new remit and direction was incredibly scary, but a challenge I relished. Fortunately, that role survived the turmoil around Covid-19 and I’m still here to tell many tales. I have been able to upskill, adding to my “jack of all trades” mantra that now sees me capable of some very interesting and potentially dangerous things, as well as embedding myself, relatively seamlessly, into a well-established and very experienced team. 2021 will see me continue in this role, with the same people, adding various activities along the way. I am glad of this consistency in my work life as the world around us continues to be so unpredictable.

The Future is Bright…

When news breaks of a large tech company sale, in the midst of a global pandemic, people will sit up and take note. It’s fair to say this one is huge news, and the future is indeed very bright.

https://www.epicor.com/en-uk/press-room/news-releases/clayton-dubilier-rice-to-acquire-epicor-software-corporation-from-kkr/

On a personal note, this will be the 4th acquisition/takeover/purchase of a company I work for in 5 years – see previous posts for notes on those!

5 Useful SendGrid Tips

I recently started working on a concept for a message list subscription service, to be integrated directly into new product features so that people can add themselves to the list and receive updates about specific features. This led me down the path of using Azure Storage and SendGrid, building 2 raw prototype apps (in WinForms at this early stage) to handle both the subscribing and the sending of data to the subscribers. Writing this on .NET Framework 4.8 with C# in a fairly native way has given me a great way to investigate and utilise the SendGrid platform to send emails to multiple subscribers.
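For a rough idea of the storage side, here is a minimal sketch of fetching subscribers from an Azure table. It assumes a table named “Subscribers” where the RowKey holds the email address (as the snippet in tip 4 below implies); the entity and method names are illustrative, not my actual code.

using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical entity: RowKey = email address, Name = display name.
public class SubscriberEntity : TableEntity
{
    public string Name { get; set; }
}

public static class SubscriberStore
{
    // Fetch everyone subscribed to a given feature (PartitionKey = feature id).
    public static List<SubscriberEntity> GetSubscribers(string connectionString, string featureId)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var table = account.CreateCloudTableClient().GetTableReference("Subscribers");

        var query = new TableQuery<SubscriberEntity>().Where(
            TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, featureId));

        return table.ExecuteQuery(query).ToList();
    }
}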

I wanted to share my findings rather than my code at this stage, so here are my Top 5 SendGrid tips:

  1. Do review Microsoft’s guide on how to get started with SendGrid via Azure; this simple article gives you a great base for setting up the back-end and starting to write C# around it: https://docs.microsoft.com/en-us/azure/sendgrid-dotnet-how-to-send-email
  2. Use Unsubscribe groups – these help you stay within GDPR in Europe by appending an Unsubscribe and Manage Preferences link to the bottom of the emails that are sent, e.g.:
[Screenshot: Unsubscribe preferences from SendGrid]
  3. Write both plain text and HTML content for your email. I ended up creating an HTML builder so I could convert email addresses to <a href="mailto:EmailAddress"> and any http(s):// links to <a href="http://link">, which was a great experiment with regular expressions (see the sketch after this list). Note, I found that not all email clients auto-convert, e.g. Outlook will convert an email address but not a URL. I was creating my body text in a Rich Text Box, so it was just one big string!
  4. Look into Personalisations (https://sendgrid.com/docs/for-developers/sending-email/personalizations/) – these are crucial if you want to customise your output (the emails themselves). I created variables to use in subject and body lines so I could pass in {customer} and {product}; SendGrid has a substitution capability which comes as part of personalisation. I’ll share a snippet of code here, as I found a solution on StackOverflow (https://stackoverflow.com/a/53292550) but it needed a slight tweak, changing the Tos.Add to a msg.AddTo, as the To: email address needs to be part of the personalisation – I wasn’t sure if this was down to API changes or not! Another tip: if you want the subject to be the same for everyone, without substitutions, you can use the SetGlobalSubject method. The wider setup around this snippet is sketched after the list.
// "msg" is the SendGridMessage being built; each subscriber gets their own
// personalization so the To: address and its substitutions stay together.
var personalizationIndex = 0;
foreach (var subscriber in subscriberEntities)
{
    // AddTo (rather than Tos.Add) attaches the To: address to the
    // personalization at this index.
    msg.AddTo(new EmailAddress(subscriber.RowKey, subscriber.Name), personalizationIndex);

    // Substitution tokens used in the subject/body templates.
    msg.AddSubstitution("{product}", product.Description, personalizationIndex);
    msg.AddSubstitution("{customer}", subscriber.Name, personalizationIndex);

    personalizationIndex++;
}

  5. Create a test email method that uses the same logic as your core one, but only sends to one email address. You don’t want to be sending bulk emails out without checking them first!
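To tie a few of these tips together, here is a rough sketch of the setup around the snippet in tip 4. The from address, unsubscribe group ID, and method shape are illustrative assumptions, not my actual code (it reuses the SubscriberEntity from the storage sketch above):

using System.Collections.Generic;
using System.Threading.Tasks;
using SendGrid;
using SendGrid.Helpers.Mail;

public static class UpdateMailer
{
    public static async Task SendUpdateAsync(string apiKey, string subject,
        string textBody, string htmlBody, IEnumerable<SubscriberEntity> subscriberEntities)
    {
        const int unsubscribeGroupId = 123;   // tip 2: placeholder group ID

        var client = new SendGridClient(apiKey);
        var msg = new SendGridMessage();
        msg.SetFrom(new EmailAddress("updates@example.com", "Feature Updates")); // placeholder
        msg.SetGlobalSubject(subject);            // same subject for everyone, no substitutions
        msg.AddContent(MimeType.Text, textBody);  // tip 3: plain text...
        msg.AddContent(MimeType.Html, htmlBody);  // ...and HTML versions
        msg.SetAsm(unsubscribeGroupId, new List<int> { unsubscribeGroupId }); // tip 2

        // Tip 4: one personalization per subscriber, as in the snippet above
        // (add {product} and any other substitutions in the same way).
        var personalizationIndex = 0;
        foreach (var subscriber in subscriberEntities)
        {
            msg.AddTo(new EmailAddress(subscriber.RowKey, subscriber.Name), personalizationIndex);
            msg.AddSubstitution("{customer}", subscriber.Name, personalizationIndex);
            personalizationIndex++;
        }

        var response = await client.SendEmailAsync(msg);
        // A 202 Accepted status means SendGrid has queued the mail for delivery.
        System.Diagnostics.Debug.WriteLine(response.StatusCode);
    }
}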
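And for tip 3, the kind of regex-based HTML builder I mean might look something like this; the patterns are deliberately simplified for illustration and not the exact ones I used:

using System.Net;
using System.Text.RegularExpressions;

public static class HtmlBodyBuilder
{
    // Turn a plain-text body (one big string from a Rich Text Box) into simple
    // HTML: encode it, wrap URLs and email addresses in anchors, keep line breaks.
    public static string Linkify(string plainText)
    {
        var html = WebUtility.HtmlEncode(plainText);

        // Bare http(s):// links -> <a href="...">...</a>
        html = Regex.Replace(html, @"https?://[^\s<]+",
            m => "<a href=\"" + m.Value + "\">" + m.Value + "</a>");

        // Email addresses -> mailto: links (Outlook converts these itself,
        // but not URLs, hence handling both explicitly).
        html = Regex.Replace(html, @"[\w.+-]+@[\w-]+(\.[\w-]+)+",
            m => "<a href=\"mailto:" + m.Value + "\">" + m.Value + "</a>");

        // Preserve the line breaks from the plain-text version.
        return html.Replace("\r\n", "<br/>").Replace("\n", "<br/>");
    }
}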

I hope this new style of thread has been an interesting read, and I hope to do more very soon!

Q1 FY20 Part 2

Welcome to Part 2 of the end-of-year summary on the career side of things. 2019 ended on a high at work, and the previous post (here) started looking at the new job role and my initial project of migrating systems between domains. That project was effectively a merging of all my previous skills and a way to develop the new skillset required to take the role forward into 2020 and beyond.

It was mentioned in part one that the infrastructure project effectively became a DevOps implementation project, and I’ll try to delve into the bits I can discuss in this post. Firstly, there’s the inheritance: over 300 environments, many with specific mods for specific customers, and the remainder with what we call “Extended Solutions” – effectively productized mods. Then there was all the code itself; fortunately, the existing dev team has a fantastic grasp of the deep dark secrets of Git, and I have enough of a basic understanding to pick up where others left off. But then came something new to me: the builds of the code. Debug or Release, MSBuild versions, semantic versioning, Git Flow… you get the idea. All very complicated to me at the time, but now it’s in my veins!

Previously the team used an old, unsupported, broken version of Jenkins to build their code, with definitions for each Git branch: some with bat files, some hardcoded in the Jenkins config screen, basically a mess, and different rules for different people. Well, I like to standardize, so we scrapped the old and brought in the new. Cue the amazing concept that is Continuous Integration. Having some basic experience with this in Azure DevOps (remember this post?), the concept was not new; however, implementing it with Jenkins was a new experience.

I inherited a new blank Jenkins server, and the first thing I did was set up a reverse proxy via IIS and secure it with an internal SSL certificate, as well as connect it to the corporate Active Directory and restrict access to our team only. We then got a service account from Corp IT and locked the server down so that only myself and Domain Admins can get into the back end, and now we have a secured build server. Why so secure when it’s all internal? Because the CTO office allowed us to have the corporate digital certificate for code signing so long as it only existed in one locked-down place. We can call it via the Jenkins application side but not extract, manipulate or otherwise interact with it.

With this new updated (and updateable) Jenkins server, plus a couple of useful plugins (Blue Ocean is a must), we have a fantastic platform to manage and analyze our build process. The main feature we are utilizing is Multibranch Pipelines via a Jenkinsfile. The Jenkinsfile is written in Groovy and is basically a set of instructions that define a build; for example, we can say build this solution, sign it using that certificate, and publish the artifacts so we can download them afterward. The huge advantage for us is that, because we build a solution for multiple ERP versions, we can have up to 8 exes output at the end, and we now have one screen to grab them from, regardless of which Git branch we built from. On the subject of Git branches, thanks to the multibranch pipeline functionality, once our Jenkinsfile is pulled into Master it filters down to all subsequent branches, and with this feature enabled Jenkins will detect any new branches pushed back to the origin that include that Jenkinsfile.

I’ve previously written about VS Code and all the wondrous things it does, and we also discovered a Jenkinsfile checker in the form of https://marketplace.visualstudio.com/items?itemName=janjoerke.jenkins-pipeline-linter-connector. This tool allows us to check syntax against our own Jenkins server and therefore ensures an accurate rule definition every time we adjust a build. It’s time-savers like these that have boosted the team’s productivity significantly; I recently tweeted about this improvement, as I took hold of an existing codebase and fully integrated it into our new philosophy within an hour or so!
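To give a flavour of what such a Jenkinsfile looks like, here is a stripped-down declarative sketch. The solution name, signing arguments, and stage layout are illustrative placeholders, not our actual definition:

// Illustrative multibranch Jenkinsfile (Groovy) – placeholder names throughout.
pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                // Restore packages and build the solution in Release mode
                bat 'nuget restore MySolution.sln'
                bat 'msbuild MySolution.sln /p:Configuration=Release'
            }
        }
        stage('Sign') {
            steps {
                // Sign the outputs with the certificate that lives only on this server
                bat 'signtool sign /a /fd SHA256 /tr http://timestamp.digicert.com /td SHA256 MySolution\\bin\\Release\\*.exe'
            }
        }
    }

    post {
        success {
            // Publish the built exes so they can all be grabbed from one screen
            archiveArtifacts artifacts: '**/bin/Release/*.exe', fingerprint: true
        }
    }
}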

A large part of the battle has been documenting the configuration and the overall process, as well as educating colleagues (primarily developers) about how we are using these concepts and tools. I’ve found that the majority of developers know the concepts and have their own experiences with Git and CI/CD, but until you document what the process should be in your exact circumstances, they don’t necessarily see the advantages or understand just how powerful these changes are – and, more importantly, how they improve consistency and productivity across the team. The improvements are already showing for us, and I fully expect that to continue!


Q1 FY20 Part 1

Back on October 1st 2019, I decided to take a leap of faith and join the dark side; this has resulted in me becoming the UK’s first dedicated QA for our Custom development teams. Effectively, we develop, using our own SDK and lots of other clever tools, the things that customers would love to have but which do not come out of the box. Having visited many customer sites in the last couple of years, I have nothing but appreciation for the quality and depth of work this team produces, and it’s absolutely my pleasure to be a part of it going forwards.

My first task, set on Day 1, was to own the migration of systems from one domain to the other. Those who have followed any of my previous posts over the last 4 years will be aware that I went from a small local company to a global ERP vendor overnight (June 1st 2016) by way of an acquisition. Well, imagine moving that small dev team’s environments into a very well-protected and governed American corporate ecosystem; the task was effectively sat on for 3 years, and corporate policies dictated that we migrate and shut down the old!

Deciding where to start was easy… spend a week or so testing out a couple of theories (having done domain migrations previously) and work with internal IT teams to put in the relevant requests and procedures to ensure those theories were robust, scalable, and secure. Three weeks in, hours had been wasted scripting out a copy-and-paste scenario: basically a load of PowerShell scripts to do a Find/Replace-style blitz across 1000s of files, 10 different ERP versions, and 200+ development environments (with databases). There was just one slight snag: even after reworking permissions and roping IT into a 3TB file copy across 2 unconnected domains, our internally developed environment management tooling, with all its bells and whistles, did not support the new domain and had hardcoded ties to the old domain’s file server. Oops.

Rethink time… Plan B – the best of the lot. Copying databases is one of those things I literally wrote the manual on for Epicor ERP, so that’s easy; building Windows servers has been the last 10 years of my life, so again, sorted; that leaves my understanding of the tooling that sits in the middle. Fortunately, my new desk backs on to the lovely chap who wrote that tool (even though he now runs our R&D division), so with a few conversations and about 8 lines of code he rebuilt it for me to work on the new environments, allowing me to fully document it as it got deployed. Hey presto, a working blank set of servers ready for migrated data was born within a week, including the ability to build any version of ERP 10 using blank, demo, or customer data (depending on whether it’s development or QA work), and the ability to use all the latest features and, more importantly, the latest development tools, by way of Chocolatey!

The next few weeks consisted of identifying what needed to be moved and what we could spin up later on demand. The resulting list was around 120 required environments, mostly because of productised “Extended Solutions” which need to be built for each version of ERP 10 we support, but also ongoing customer projects, version uplifts, test environments for developers to test their own theories and boost their skills, etc. This was a very slow and involved process; per environment/DB it was not too bad, but in Part 2 (when I write it) I’ll go through how my domain migration project became an environment and process improvement project, featuring Git, Jenkins, CI/CD, and more.

The good news is that my domain migration, which was scheduled to be fully complete (i.e. old domain shut down) by 24th December 2019, was in fact completed on 6th December 2019. So, despite the slightly wasted 3 weeks of testing, scripting, and familiarisation, with all parties on board we (sadly) shut down the Dot Net IT domain at 17:30 that evening!