NelsonHall: Process Discovery & Mining blog feed https://research.nelson-hall.com//sourcing-expertise/digital-transformation-technologies-services/process-discovery-mining/?avpage-views=blog NelsonHall's Process Discovery and Mining program is designed for organizations considering, or actively engaged in, the application of process discovery and mining technologies as part of identifying processes for automation.

<![CDATA[IBM Converging Risk Scores to Optimize Cybersecurity Offering]]>

 

NelsonHall recently attended an IBM Security analyst day in London. This covered recent developments such as IBM’s acquisition of Polar Security on May 16th to support the monitoring of data across hybrid cloud estates, and watsonx developments to support the move away from rule-based security. However, a big focus of the event was the subject of risk.

For the last few years, the conversation around cybersecurity has shifted toward risk: highlighting potential holes in a resiliency posture, for example, and asking questions such as ‘if ransomware were to shut down operations for six hours, what would the implications be for the business?’

IBM, along with a number of other providers, has therefore been offering ‘risk scores’ for different aspects of an organization’s IT estate. These include scores from IBM’s Risk Quantification Service; from IBM Guardium, covering risk related to the organization’s data, including its compliance with data security regulations; from IBM Verify, covering risk related to particular users; and from the recently acquired Randori, the company’s attack surface management solution.

Randori, acquired in June 2022, is a prime example of IBM’s strengths in understanding and reducing risks. Its two offerings, Randori Recon and Randori Attack, aim to discover how organizations are exposed to attackers and provide continuous automated red teaming of the organization’s assets.

After Randori Attack has run discovered assets, shadow IT, and misconfigurations through its red team playbooks, clients are presented with the resulting risks through a patented ’Target Temptation’ model. In this way, organizations can prioritize the targets most susceptible to attack and monitor changes in the level of risk on an ongoing basis.

IBM’s Risk Quantification Service uses the FAIR (Factor Analysis of Information Risk) model, an Open Group standard, which decomposes risk into quantifiable components: the frequency at which a loss event is expected, and the magnitude of the loss expected per event. The service performs a top-level assessment of the client’s controls and vulnerabilities, makes assumptions based on prior examples, such as the amount of sensitive information stolen during a breach, and produces a probability of loss and the costs associated with that loss, including fines and judgments from regulatory bodies.
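To make the FAIR-style decomposition concrete, here is a minimal Monte Carlo sketch in Python. This is not IBM's implementation; the event frequency and loss range are invented for illustration of the frequency-times-magnitude idea.

```python
import math
import random

def poisson_draw(rng: random.Random, lam: float) -> int:
    """Sample a Poisson-distributed event count (Knuth's algorithm)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def annualized_loss_exposure(
    event_frequency: float,   # expected loss events per year
    loss_low: float,          # per-event loss range, e.g. inferred from prior breaches
    loss_high: float,
    runs: int = 10_000,
    seed: int = 1,
) -> float:
    """FAIR-style risk estimate: frequency of loss events x magnitude per event."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        events = poisson_draw(rng, event_frequency)      # how many events this year?
        total += sum(rng.uniform(loss_low, loss_high)    # how much did each one cost?
                     for _ in range(events))
    return total / runs

# Roughly event_frequency x mean loss: 0.5 x $300k, i.e. on the order of $150k/year
ale = annualized_loss_exposure(0.5, 100_000, 500_000)
```

A real FAIR analysis would calibrate these inputs from the client's controls, vulnerabilities, and prior loss data rather than a uniform guess, but the structure of the estimate is the same.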

This is not the first time we have seen this model, or a similar approach, being taken by vendors offering cyber resiliency services. One such vendor is Unisys, which in 2018 offered its TrustCheck assessment, using security data and X-Analytics' software to analyze the client's cyber risk posture and associate it with financial impacts. These financial impacts were plotted against the likelihood of the threat event.

TrustCheck was used as a driver for the Unisys cybersecurity business: it weighed the expected loss against the cost of remediating a gap, offering guidance on whether securing the client's environment was worth the outlay, and it conveyed this information to the C-level.

So what is the difference between IBM’s approach to risk and Unisys’ TrustCheck service?

IBM has been approaching its risk quantification from both ends: a bottom-up measurement of users, data, compliance, and the IT estate using platforms such as Guardium, Verify, and now Randori, and a top-down view within its Risk Quantification Service. At the analyst event in London, there was a clear indication that IBM is looking to converge these risk scores over time to provide a more accurate and consistent view of an organization’s risk: for example, using the outputs from Randori Recon to understand the client’s exposure; Guardium and Polar Security to understand what data is held and where it could travel; and Verify to understand what user access exists. A consistent, accurate view of the client’s resiliency would then be used to drive decision-making.

This convergence of risk scores will not be an immediate development. Randori has just undergone a year of development to integrate its UX into QRadar for a unified experience, and upcoming development will bring it into the IBM Security QRadar suite as part of an Attack Surface Management (ASM) service before a consistent risk score service is complete. Likewise, the acquisition of Polar Security needs time to bed into the data security estate.

NelsonHall does, however, welcome any moves that help more organizations understand the risks to their business and the financial exposure associated with them. This has traditionally been a major stumbling block for organizations in deciding what remediation to undertake to raise their security posture beyond the baseline of compliance requirements.

]]>
<![CDATA[The Future of Process Discovery & Mining: 2023 and Beyond]]>

 

Process discovery and mining platforms, which examine organizations’ process data as part of transformation initiatives, have become an increasingly critical part of process automation and reengineering journeys.

Here I look at what to expect in the process discovery and mining space in 2023 and beyond.

Continuous Monitoring

Traditionally, we have seen centers of excellence (CoEs) use process discovery and mining solely for single, point-in-time process improvements as part of the transformation journey. As a result, once an improved process is in place, the focus (and the licenses) for the process discovery and mining suites move on to the next project.

In 2023, we predict that these solutions will be used to support more process analysis on an ongoing basis, with licenses applied to already reengineered processes to support KPI monitoring and continuous process improvement. Features and pricing models being built into the platforms are certainly reinforcing this move, such as the ability to use process discovery platforms to train users on the parts of processes that cannot be automated, and unlimited usage licenses that are not tied to the number of users or the volume of process data ingested.

In this way, process discovery and mining solutions can provide a real-time view of actual process performance, augmenting business process management (BPM) platforms.
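As a sketch of what such ongoing KPI monitoring could look like, the following Python fragment computes cycle times from a mined event stream and flags the cases breaching a service-level target. The event data, activity names, and target are all invented for illustration.

```python
from datetime import datetime

# Hypothetical mined event stream: (case_id, activity, timestamp)
events = [
    ("c1", "invoice received", datetime(2023, 1, 2, 9, 0)),
    ("c1", "invoice paid",     datetime(2023, 1, 4, 17, 0)),
    ("c2", "invoice received", datetime(2023, 1, 3, 10, 0)),
    ("c2", "invoice paid",     datetime(2023, 1, 10, 12, 0)),
]

KPI_TARGET_DAYS = 5  # assumed service-level target for end-to-end cycle time

def cycle_time_days(case_events):
    """Elapsed days between a case's first and last recorded event."""
    times = [ts for _, _, ts in case_events]
    return (max(times) - min(times)).total_seconds() / 86_400

def kpi_breaches(events):
    """Group events by case and return the case IDs exceeding the KPI target."""
    cases = {}
    for case_id, activity, ts in events:
        cases.setdefault(case_id, []).append((case_id, activity, ts))
    return [c for c, evs in cases.items()
            if cycle_time_days(evs) > KPI_TARGET_DAYS]

# c1 completes in ~2.3 days, c2 in ~7.1 days, so only c2 breaches the target
breaches = kpi_breaches(events)
```

Run continuously over fresh event data rather than once per project, the same check becomes the kind of real-time performance view described above.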

Automation and Low Code Application Development Links

Process discovery and mining have always been great lead-ins for automation, revealing what processes are in place before automating them; however, that connection has mainly been one-way, i.e. process discovery and mining platforms sending the skeleton of a process over to automation platforms to build an automation.

In 2023, we see process discovery platforms implementing more functionality in the reverse direction, taking automation logs from the automation tools back into the discovered process to track the overall performance of the process on an ongoing basis, whether the steps are automated or performed by a human.

Likewise, when a process cannot be fully automated and requires human effort, automation platforms are implementing low-code applications to collect the necessary information. We envisage process discovery and mining platforms not only building the skeletons of processes for automation, but also building suggestions for those low-code applications.

Digital Process Twins

Usually, when we refer to digital twins, we are talking about a digital representation of a physical process, enabled by IoT as part of Industry 4.0. However, at the end of last year we saw one or two vendors moving towards the creation of digital process twins for business operations.

The digital process twin is the culmination of continuous monitoring of both a process and its automations. Using these capabilities, process understanding solutions can become the future of BPM, providing real-time tracking of process performance and enabling opportunities such as preventative maintenance, leveraging root cause analysis to detect when a process is showing signs of straying from the target model.

Object-Centric Process Mining

Traditional process mining assigns a single case notion to every event in the process, but this is not the best fit for every process.

For example, in car manufacturing, a single case notion cannot easily connect the materials for the window glass to the finished vehicle: a car manufacturer will not assign the same case ID to silica arriving at the factory as to the car that will eventually be fitted with windows made from that consignment.

In object-centric process mining (OCPM), a single case notion is no longer the only linking piece in a process. Instead of the case being the be-all and end-all, each object involved in the process is tracked individually, with its own attributes, as part of the whole process.

In the car example, the object-centric process could then relate case numbers from user issues to the vehicle, to the order the customer placed, to the windscreen, and to the delivery of silica.

OCPM will expand the usefulness of process discovery and mining from fairly simple processes tied to a single case, such as a ticket number on an email, to a more complete view of the process.
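A minimal sketch of the idea, using an invented event log rather than any vendor's format: each event references one or more typed objects, and the classic single-case trace becomes just one projection of the log.

```python
from dataclasses import dataclass

@dataclass
class Event:
    activity: str
    objects: dict  # object type -> object id; an event can touch several objects

# Object-centric event log for the (hypothetical) car example
log = [
    Event("silica delivered",    {"delivery": "d7"}),
    Event("windscreen produced", {"delivery": "d7", "windscreen": "w3"}),
    Event("order placed",        {"order": "o1", "car": "c9"}),
    Event("windscreen fitted",   {"car": "c9", "windscreen": "w3"}),
    Event("issue raised",        {"ticket": "t5", "car": "c9"}),
]

def trace_for(object_type: str, object_id: str):
    """Project the log onto one object: the classic single-case view."""
    return [e.activity for e in log
            if e.objects.get(object_type) == object_id]

# Each object yields its own trace from the same log, e.g.
# trace_for("car", "c9") -> ["order placed", "windscreen fitted", "issue raised"]
```

Because "windscreen fitted" references both the car and the windscreen, and "windscreen produced" references both the windscreen and the silica delivery, the chain from user issue back to raw material falls out of the log without forcing everything under one case ID.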

***

In this quick look at the future of process discovery and mining, we acknowledge that the features described here may not be the core application of these platforms before the end of 2023. Organizations will continue to use existing functionality to target the bulk of legacy processes requiring quick fixes, reducing costs and performing one-off improvements to processes.

]]>