A recent InfoWorld column by Paul Venezia, titled “The thin line between good and bad automation,” describes the serious problems that can result when enterprises use automation scripts in an ad hoc manner rather than as part of a holistic delivery model.
Venezia makes a compelling argument that “there’s a slippery slope where cobbling together a variety of automation scripts in an effort to lighten the workload actually winds up achieving the opposite, and horrible things happen.” The “horrible things” he describes include scripts being reused for multiple and unrelated projects, resulting in massive database crashes and a “Rube Goldberg machine of unintended consequences.”
The “bad automation” Venezia describes is a very real threat. The good news is that newer and better tools are available to help prevent these kinds of problems. Many IT shops are using robotic process automation (RPA) tools in DevOps to automate a variety of processes – including build, deploy, test, provision, monitor, discover, diagnose and heal – to enforce best practices and to keep cobbled-together automation scripts from sliding down the slippery slope Venezia describes. When applied properly, RPA tools provide a centralized, standardized control function for creating, modifying and running automations, with features such as audit logs, access rights and other control points that enhance visibility and ensure best practices are followed.
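To make the idea of “control points” concrete, here is a minimal sketch (in Python, with hypothetical automation names, roles and functions – not any particular RPA product’s API) of what a single controlled entry point for automations might look like: every run passes through an access check and is recorded in an audit log, rather than living as a standalone script anyone can copy and repurpose.

```python
import datetime
import getpass

# Hypothetical sketch of centralized control points: every automation is
# invoked through one registry that checks access rights and keeps an
# audit trail. Names, roles and storage are illustrative only.

AUDIT_LOG = []  # in practice this would be a durable, append-only store

PERMISSIONS = {
    "deploy_app": {"release-engineers"},
    "restart_db": {"dba", "ops-oncall"},
}


def run_automation(name, user_roles, action, **params):
    """Run a registered automation only if the caller's roles allow it,
    and record every attempt, allowed or denied, in the audit log."""
    allowed = bool(PERMISSIONS.get(name, set()) & set(user_roles))
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "user": getpass.getuser(),
        "automation": name,
        "params": params,
        "allowed": allowed,
    })
    if not allowed:
        raise PermissionError(f"{name} is not permitted for roles {user_roles}")
    return action(**params)


# Example usage: a deploy routine is wrapped so it can no longer be
# copied and reused ad hoc outside the controlled entry point.
def deploy_app(version):
    print(f"deploying version {version}")


run_automation("deploy_app", {"release-engineers"}, deploy_app, version="1.4.2")
```

The point of the sketch is not the specific code but the pattern: the automation itself is ordinary, while visibility and control come from forcing every invocation through a gate that logs who ran what, with which parameters, and whether it was permitted.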
That said, it’s important to remember that any function or process – whether automated or manual – can wreak havoc if not properly planned, managed and integrated. So while RPA tools can enhance service management, they need to operate within a governance framework and process discipline that covers each individual project and initiative as well as the overall service delivery model. Without that discipline, a tool is just a tool – depending on how it’s used, it can either get the job done or make a mess of things.