NelsonHall: Big Data & Analytics blog feed
https://research.nelson-hall.com//sourcing-expertise/it-services/big-data-analytics/?avpage-views=blog

Infosys’ Analytics Practice Aims to Bridge Business Needs & AI-Related Technical Services

 

We recently talked to Infosys about its analytics and big data capabilities in its Data and Analytics (DNA) practice.

DNA is a significant practice within Infosys, representing, we estimate, ~8% of the company’s headcount. It continues to enhance its portfolio, expanding from technical services into business services, and segments the portfolio around three themes: Modernize, Monetize, and Network.

Shifting the portfolio from technology services to new business models

Modernize offerings cover the IT side of analytics and big data, with most activities centering on big data, data lakes, EDWs, and cloud (for both cloud hosting and PaaS), taking a brownfield approach that makes use of existing client investments. DNA highlights that client demand is driven by use cases.

Under the Monetize umbrella, DNA is using analytics to help clients improve operations, with the aims of growing revenues, driving operational efficiency, and meeting GDPR compliance requirements. Most Monetize projects are based on more widespread use of analytics.

With its Network offerings, DNA supports clients in launching new services based on data, from both internal and external sources.

As part of this portfolio, DNA has launched its Infosys Data Marketplace (IDM) IP and services and is helping a medical device firm develop new health services using IoT-based data such as nutritional and fitness data.

With IDM, DNA highlights that it wants to democratize business models based on data.

Continued push in portfolio verticalization

In parallel, DNA continues to specialize its services and has structured its specialization effort around two areas: verticalization in the form of solutions & blueprints and AI.

Verticalization continues to be a priority, with DNA having created multiple software-based solutions for the different sectors it covers. DNA drives its verticalization effort with Infosys’ vertical units and produces either standalone accelerators or solutions (e.g. inventory optimization, supply chain early warnings) or embeds analytics within a larger Infosys platform (e.g. within the Finacle Universal Banking Solution).

A recent example of a standalone analytics solution is its Healthcare Analytics Platform, which targets the needs of healthcare providers and brings member-centered analytics to hospitals – e.g. in precision medicine, disease analysis, and utilization rates.

Investing in AI use cases

AI continues to be a significant driving force behind Infosys overall and DNA in particular.

A recent example of investment is Digital Brain, built for Infosys’ own HR needs: it matches projects’ requirements for specific skill profiles with corresponding employees, and selects training sessions to help Infosys personnel acquire the digital skills that upcoming projects will require.

Infosys has positioned Digital Brain as part of its Live Enterprise initiative, and it is one of the cornerstones of Infosys’ own transformation into a digital enterprise.

In parallel, DNA is systematically mapping uses for AI. A core element of DNA’s AI portfolio addresses use cases such as fraud management, product recommendation engines, and chatbots. Increasingly, DNA has worked with clients on image- and video-based recognition and has developed accelerators that include driver drowsiness detection (automotive), damage detection (insurance), and queue counting/customer wait-line monitoring (retail).
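
For illustration, the following is a minimal sketch of the kind of video analytic that sits behind a queue-counting accelerator, using OpenCV’s pretrained pedestrian detector. DNA’s actual accelerators are proprietary and likely use deep-learning models; the video source and alert threshold here are assumptions.

```python
# Minimal queue-counting sketch using OpenCV's pretrained HOG person
# detector. Illustrative only: the video feed name and the staffing
# threshold are hypothetical.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture("checkout_camera.mp4")  # hypothetical store camera feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Each detected bounding box is one person waiting in the frame
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    if len(boxes) > 5:  # assumed threshold for opening another counter
        print(f"Alert: {len(boxes)} customers waiting in line")
cap.release()
```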

To encourage the spread of AI across its client base, DNA is pushing the notion of AI pods, with the intent of making clients more aware of AI’s possibilities. The company has structured AI pods in several forms, whether dedicated to one client or working for several, and focused either on AI technologies such as video analytics or on specific use cases.

Analytics becoming more business-oriented

Looking ahead, outside of the IT department, clients are asking for further analytics across their operations, pushing their IT function to democratize analytics tools. Next-gen dashboards such as data visualization tools are helping here but are only the beginning of the answer. We expect vendors to invest further in this area, with dashboards that are more vertically aligned and friendlier to non-technical users.

IT departments are still facing challenges of complex technologies and data migration to data lakes and big data infrastructures. Vendors are reducing this complexity by building platforms to collect and clean data and run analytics. Their next challenge is around AI: AI brings a new level of complexity that is further constrained by the small number of AI specialists globally. We are starting to see vendors investing in making the creation of AI-based algorithms more accessible to non-specialists.

Expect DNA to invest more in making big data and AI technology accessible to non-specialists, while continuing to work with clients’ business groups to make analytics more relevant to business needs.

DXC Supports BMW in its Autonomous Vehicle Journey

 

We recently talked to DXC Technology’s Analytics business line about its work with German premium car OEM BMW, its positioning in autonomous vehicles, and its differentiation strategy.

BMW recently shed some light on the dynamics behind its Vision iNEXT autonomous car, due to launch in late 2021. Vision iNEXT, BMW’s first autonomous vehicle, will offer Level 3 autonomous driving as an option. Drivers of Vision iNEXT will be able to surrender control of their vehicles for “long periods” of time and at speeds of up to 80 mph.

To support its Vision iNEXT program, BMW started work on its High Performance D3 (Data-Driven Development) program two years ago. The D3 name reflects that the program is based on the collection of vast amounts of data. BMW is gathering data from its fleet of 80 test 7 Series cars operating on the U.S. west coast and in Germany, Israel, and China. BMW is planning to ramp up the test fleet to ~140 vehicles by the end of 2019, looking to capture vast amounts of data to understand a wide range of traffic scenarios.

The project is about scale. Through its fleet of test vehicles, BMW believes it will have in-car and out-of-car sensor data collected across 3.7m miles. However, not all data is relevant, and BMW believes it will need to extract 1.5m miles’ worth of data out of the 3.7m. BMW will also need to complement in-field data with simulation-based synthetic data, representing the data that would be collected by autonomous vehicles across 150m miles. The required IT infrastructure reflects the scale of the project, with a storage capacity of 320 petabytes. Over time, BMW will need to reprocess the data, and this will demand massive computing power.

This is where DXC Analytics is helping: the unit has been involved at the IT infrastructure and data lake level and designed and deployed the big data solution close to the headquarters of BMW in Munich. To support the project, DXC has used Robotic Drive, a big data platform it has customized to the specific needs of the automotive industry. Robotic Drive combines open source software (e.g. the Apache Hadoop ecosystem of software tools) and several accelerators (e.g. analysis of data in vehicle native format, reference architectures relying on clusters for AI training purposes).
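
Robotic Drive itself is not public, but the relevance-extraction step BMW describes (keeping ~1.5m of 3.7m driven miles) can be sketched on the open-source layer the platform builds on. Below is a minimal Spark example; the schema, paths, and thresholds are assumptions, not BMW’s actual criteria.

```python
# Minimal sketch of relevance extraction over drive telemetry on Spark,
# the kind of open-source layer Robotic Drive builds on. The schema and
# thresholds are hypothetical; real in-vehicle formats are proprietary.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("scenario-extraction").getOrCreate()

# Hypothetical decoded telemetry: one row per second of driving
telemetry = spark.read.parquet("hdfs:///fleet/telemetry/decoded/")

# Keep only segments likely to contain interesting traffic scenarios,
# e.g. hard braking, lane changes, or close objects
relevant = telemetry.filter(
    (F.col("brake_decel_ms2") > 4.0)
    | (F.col("lane_change_flag") == 1)
    | (F.col("nearest_object_m") < 10.0)
)

# Persist the reduced dataset for labeling and AI training downstream
relevant.write.mode("overwrite").parquet("hdfs:///fleet/telemetry/relevant/")
```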

From DXC’s perspective, Robotic Drive, which the company provides as part of its services, is important in differentiating its autonomous vehicle service portfolio, and DXC wants to address the high demand for data analytics. DXC Analytics is therefore focusing on commercial expansion: Robotic Drive already has several clients in Europe, and the unit is now looking to expand in the U.S. through the creation of a CoE. The unit is also investing in its sales force, recruiting pre-sales consultants, and ramping up its factory-based delivery presence in India and Poland, as well as exploring the Philippines and Bulgaria.

Internal collaboration will also play a role: DXC Analytics is increasingly working with other DXC units, notably the IoT and Enterprise Cloud Applications units; an example of joint work is around SAP HANA. Another key event that should accelerate the growth and expand the capabilities of DXC Analytics is the acquisition of Luxoft, which has just been finalized.

Luxoft will help expand DXC Technology’s automotive-specific offerings towards digital cockpits, autonomous driving and ADAS. With Luxoft, DXC Analytics gains technical and business expertise. This should help DXC Analytics expand from its big data positioning and gain a stronger footprint on the data science and algorithm side.

Sopra Steria: Building on Big Data & Analytics Initiatives

 

We recently talked with Sopra Steria about its work and capabilities around big data and analytics. The company has created a Data Science CoE (or Technology Stream in Sopra Steria’s terminology), which brings specialized services and expertise to the various in-country systems integration (SI) business units, focusing on its key accounts and bringing vertical knowledge.

Sopra Steria’s Data Science unit has been developing AI-based use cases, focusing on the analysis of unstructured data, including through the use of computer vision technologies.

Applying AI: client examples

An example of its work applying AI algorithms to unstructured data is an aircraft manufacturer client, for which Sopra Steria has developed solutions that automate the inspection of aircraft using pictures taken by drones. The approach, which is much faster than manual inspection, stores the drone pictures and compares them in real time with a repository of pictures showing known anomalies.

The process started with the drone and AI identifying simple items, such as missing paint or screws on the aircraft, and is now addressing more complex anomalies as Sopra Steria grows its expertise. Sopra Steria estimates that it requires ~200 pictures to teach the ML algorithm to spot a given anomaly, and believes its approach is now mature enough to be applied to similar projects with other clients.
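
To illustrate why a few hundred labeled pictures can be enough, here is a minimal transfer-learning sketch of the kind of defect classifier described above. Sopra Steria’s actual pipeline is not public; the folder layout, class names, and hyperparameters are assumptions.

```python
# Minimal transfer-learning sketch: fine-tune a pretrained backbone on
# ~200 labeled inspection images. Paths, classes, and settings are
# hypothetical; this is not Sopra Steria's actual pipeline.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfms = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
# Assumed layout: inspection/ok/*.jpg and inspection/anomaly/*.jpg
data = datasets.ImageFolder("inspection", transform=tfms)
loader = torch.utils.data.DataLoader(data, batch_size=16, shuffle=True)

# Start from an ImageNet-pretrained backbone and retrain only the head;
# this reuse of learned features is what makes ~200 images workable
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # ok vs. anomaly

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
model.train()
for epoch in range(10):
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
```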

Another example is a project based on the use of satellite images. Sopra Steria has helped an electricity grid operator analyze its network, identify where it needs to prune trees, and prioritize that work. Unlike the aircraft example, this approach does not rely on edge-based computing, as flying drones in areas with many trees is a challenge. The broad principles are the same, however, and the approach helps prioritize the trees most likely to interfere with the electricity grid.

Creating IP from expertise

Looking ahead, Sopra Steria’s Data Science unit wants to create IP out of its expertise. The CoE acknowledges it is walking a fine line between AI cloud vendors, which tend to offer vertical-agnostic micro-services, and ISVs that are highly specialized (e.g. SenseTime and NtechLab, in China and Russia respectively, both around video surveillance). The unit is adopting two main approaches:

  • A methodology approach for use case development. For example, in drone-based aircraft inspection, it knows what images of anomalies it needs to look at to optimize the ML learning process. And for email automation based on interpreting language utterances, the unit has created a repository of terms and jargon specific to the insurance industry
  • An internal focus. The unit has been taking part in Sopra Steria’s Inner Source approach and is creating AI and ML micro-services that it wants its developers to use. Indeed, it is finding that its software developers have an appetite for using the AI/ML micro-services it has created, and the CoE is now acting as a support organization for using these micro-services and applying them to projects (a minimal sketch of this micro-service pattern follows the list). We view this approach as a positive step in Sopra Steria’s evolving IP strategy: while Sopra Steria has been investing in commercial software products (e.g. Sopra HR Software, Sopra Banking Software, and its real estate software), the firm’s SI units have been less vocal about their IP creation. This is now changing, driven initially by Inner Source, which provides software developers with the software tools and environments they require, and Sopra Steria is now accelerating its IP approach.
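
The micro-services themselves are internal to Sopra Steria, but the pattern is a familiar one: wrap a trained model behind a small HTTP endpoint that any project team can reuse. Below is a minimal sketch; the service name, model file, and payload shape are all hypothetical.

```python
# Minimal sketch of an internal ML micro-service: a trained model
# exposed over HTTP for reuse by project teams. The endpoint, model
# file, and payload are hypothetical.
from fastapi import FastAPI
from pydantic import BaseModel
import joblib

app = FastAPI(title="insurance-email-classifier")  # assumed service name
model = joblib.load("email_intent_model.joblib")   # assumed trained pipeline

class EmailIn(BaseModel):
    subject: str
    body: str

@app.post("/classify")
def classify(email: EmailIn):
    # The pipeline is assumed to embed its own text vectorizer
    label = model.predict([email.subject + " " + email.body])[0]
    return {"intent": str(label)}
```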
NIIT Technologies Adapts Data & Analytics Around Big Data & AI

 

In a recent meeting with NIIT Technologies, we discussed how its Data & Analytics offering has been adapting, with a focus on developments around big data and AI. Here are the main talking points.

Overcoming tool fragmentation

One priority has been NIIT Tech’s Xpress Suite, one of Data & Analytics’ most popular offerings, which seeks to overcome the fragmentation of big data and analytics software tools.

Data & Analytics has taken a platform approach, relying on a standard architecture and pre-integration across software tools, most of them open source. The approach is brownfield: Data & Analytics integrates Xpress Suite with the client’s existing software investments.

Xpress Suite includes a series of accelerators around four main use cases: migration of data to the cloud, MDM, data lakes/big data, and insights/AI. The most popular to date have been around data migration to the cloud and data lakes/big data.

Data Lab as ideation and prototyping centers

Data & Analytics’ second most popular offering is Data Lab, a set of virtual centers used for conducting ideation with clients, identifying use cases, and creating PoCs, all within six weeks. The nature of projects varies significantly, with customer 360 data projects and social analytics being recurring themes.

Developing industry-relevant use cases

Data & Analytics has also been working on creating industry-specific analytics use cases around customer data. The practice has adopted a start-up-like ‘fail fast’ approach and set up a dedicated team, Incubation Services, that creates reusable use cases, with ML playing an important role in these. The team talks with NIIT Tech’s industry units to identify appropriate use cases.

Development of each use case (not a product) takes around four to six weeks. Where there is client take-up, Data & Analytics will then customize it to that client’s specific needs. Most of these use cases are in the travel & transportation industry, with others in insurance and BFS.

Data & Analytics is also bringing its capabilities to enhance the analytics functionality of the industry-specific software products that NIIT Tech has created or acquired. Two examples are:

  • Airline sector revenue optimization software
  • Insurance sector products for underwriting and financial reporting.

The range of services and the scope of projects vary extensively across technologies.

Example clients are:

  • A travel company that wanted to monitor the performance of its sales teams and marketing operations across geographies and brands. Data & Analytics helped create a data lake for the company’s sales data, using its Data Lake Xpress IP, and then deployed analytics. The PoC lasted six weeks and the implementation about six months. The client now has a modernized platform integrating data from various sources, providing better visibility of its sales, customers, and marketing activities
  • A U.S. insurance firm, where Data & Analytics was involved in a project for image recognition of property damage, based on images taken by the client’s property damage assessors, and using AI to process the images and identify the nature and extent of the damage
  • A European wealth management firm, for which Data & Analytics set up a data lake using NIIT Tech’s DLXpress IP. The solution ingests financial data, masks it for data security purposes, and provides analytics. It is used by portfolio managers for their fund allocation needs, and to understand and explain their investment strategies and risk exposure.

Data & Analytics highlights that it has set up an effective structure and intends to continue investing in use cases via its ‘fail fast’ approach.

DXC Bets $2bn on Recovery of Luxoft to Scale its Digital Capabilities

Yesterday morning, DXC announced its intended acquisition of Luxoft in an all-cash transaction at $59 per share, around $2bn in total. This represents a 48% premium over Luxoft’s average closing share price over the previous ninety days (and an ~86% premium on Friday’s closing price). The deal is expected to close by end June 2019.

In recent years DXC (including as CSC) has made a number of acquisitions that have expanded its ServiceNow, Microsoft Dynamics, and recently Salesforce capabilities and formed the bedrock of its Enterprise & Cloud Apps (ECA) practices. This is different: the Luxoft transaction is closer in feel to its 2016 acquisition of Xchanging, which brought in Insurance sector capabilities, or the more recent acquisition in the U.S. of Molina Medicaid Solutions. In all three cases, DXC is acquiring a company that has specific issues and challenges but that also expands DXC’s own industry capabilities; Luxoft will in addition expand DXC’s capabilities around Agile/DevOps.

Luxoft is a company in transformation

With revenues of $907m in FY18 (the year ended March 31, 2018) and nearly 13k personnel, Luxoft is a mid-sized firm. DXC is presenting Luxoft as a “digital innovator”, but it is a company that is grappling with significant client-specific and market challenges. Until FY17, it was highly successful, enjoying revenue growth in the range of 20% to 30%. FY18 saw a slowdown, still to a very solid level of 15.4% (of which we estimate ~7% organic), but FY19 has seen flat growth.

In particular, Luxoft has been hit hard by its dependency on the investment banking/capital markets sector, and in particular on two clients: UBS and Deutsche Bank. Back in FY15 they accounted for over 56% of Luxoft’s total revenues (~$294m). Since then, Luxoft has been growing its share of wallet in other key accounts, and the combined revenues from clients 3 to 10 have increased from $123m in FY15 to ~$208m in FY18, a CAGR of ~19%, with clients 5 to 10 growing at nearly 30%. In FY19 Luxoft is expecting around 13% revenue growth from these accounts (to, we estimate, ~$235m).

But while there has been very strong growth in its other top 10 accounts, Luxoft has since FY18 been impacted by declining revenues at both UBS and Deutsche Bank (the latter by 13.4%). H1 FY19 saw an 11% y/y decline, and these two accounts now represent just over 30% of total revenues. Both have been insourcing some talent. While Luxoft believes that the UBS account is now stabilizing, Deutsche Bank is more challenged, and the account remains an issue: revenues are likely to decline by ~44% in FY19 to ~$90m, or <10% of total revenue, with a further contraction in FY20.

Outside these two, Credit Suisse is also a major client, and Luxoft is clearly exposed to the slowdown in the European capital markets/investment banking sector. But elsewhere in financial services there are much stronger near-term opportunities in the wealth and asset management sector, particularly in the U.S., and there is the potential for DXC to help Luxoft expand its presence in the Australian banking sector.

Luxoft has been looking to diversify its sector capabilities in recent years, in particular beefing up its offerings to the automotive sector, developing relationships, mostly in Europe, with tier-one OEMs and suppliers such as Daimler, Continental, and Valeo. Automotive & Transport is a hyper-growth business for Luxoft, delivering nearly 43% growth in FY18, but for a company the size of DXC this is a small business: FY18 revenues were $158m (FY19 revenues are likely to be ~$220m, boosted by Luxoft’s acquisition of embedded software specialist Objective Software, which has brought in some U.S. client relationships). Some of these are large accounts: four of Luxoft’s top 10 accounts are in the automotive sector, and one is a client common to both DXC and Luxoft.

In its Digital Enterprise unit, which services all other verticals, Luxoft has been shifting toward more digital offerings, while looking to reduce its exposure to low-margin work. Revenue performance in the Digital Enterprise unit has been erratic, with a strong performance in FY18 followed by a 13% decline in H1 FY19, though Luxoft claims to be confident that it has completed the transformation of the unit.

In brief, among the capabilities that Luxoft will bring to DXC we see:

  • Significant agile development capabilities, enhancing DXC’s application services business
  • Some analytics capabilities
  • Some product engineering services capabilities in the automotive sector, plus some experience in IoT-centric projects
  • Offerings around UX design (in June 2018, Luxoft acquired Seattle-based design agency Smashing Ideas from Penguin Random House).

Luxoft has also been developing its capabilities in blockchain, an area where we suspect DXC has little experience, with pilots in the healthcare, government (e-voting in Switzerland), and automotive sectors.

And, of course, Luxoft has a sizeable nearshore delivery capability in Eastern Europe. Luxoft’s delivery network has its roots in Ukraine and Russia. In reaction to the 2014 Ukraine-Russia crisis, the company initiated its Global Upgrade program with the intent of de-risking its profile and increasing its presence in other nearshore locations, in particular Romania and Poland. Since FY14, Luxoft has decreased its headcount in Ukraine from 3.6k to 3.1k and in Russia from 2.3k to 1.9k. In parallel, Luxoft has significantly increased its presence onshore, with now 1k personnel in North America, making its delivery network far less risky for clients. DXC highlights that it will be able to help Luxoft scale its delivery footprint in the Americas and India.

DXC is betting Luxoft will help accelerate its topline growth

While Luxoft has been grappling with declining margins – partly, but not solely due to the declines at Deutsche Bank and pricing pressures in other accounts – DXC is emphasizing the topline opportunities, rather than cost synergies. Given DXC’s track record in stripping out costs, we imagine Luxoft employees will be glad to hear this.

DXC is targeting revenue growth from:

  • Luxoft achieving 15% revenue growth over the next three years
  • Revenue synergies of $300m to $400m over this period, representing 1% to 2% of additional revenue growth for DXC

To achieve this, DXC is looking to cross-sell, for example, the:

  • Product engineering capabilities of Luxoft to North American and Asian automotive clients, and to other sectors, prioritizing high-tech, manufacturing, and healthcare
  • Digital capabilities of Luxoft into DXC’s client base. DXC claims that all of Luxoft’s business is, by its definition, digital, thus adding nearly $1bn in revenues to DXC's own $4bn digital business, and expects to grow this $5bn business by another 20% annually
  • Managed cloud and digital workplace capabilities of DXC into the Luxoft base (where, however, there are typically well entrenched incumbents).

DXC is also looking to broaden the use of Luxoft assets, taking FS and automotive capabilities and applying them to industries where Luxoft has not historically had a large presence. As an example, Luxoft has developed data visualization assets for FS clients, capabilities it believes could be applied to other sectors.

How will DXC and Luxoft integrate?

One key question is how DXC will manage the integration. In the short term at least, Luxoft will remain an independent company, retaining its brand and senior leadership (DXC intends to have retention plans in place for key Luxoft execs). For DXC to ultimately position as an end-to-end, global IT services organization, able to offer clients a full spectrum of services ranging from digital transformation advisory and concept testing through to IT modernization in all its key geographies and target markets, there will need to be, at least in appearance, an integrated go-to-market approach and a standardized global delivery operation that leverages these newly acquired assets.

David McIntire, Dominique Raviart, Rachael Stormonth

Highlights from IBM & The Weather Company at Red Bull Racing (vlog)

 

Mike Smart presents summary highlights from IBM and The Weather Company at Red Bull Racing.

Atos Strengthens Key Partnerships, Launches New AI Products

 

NelsonHall recently attended Atos’ Technology Day in Paris, where we heard updates on its digital technology initiatives, including key partnerships with Siemens and Google, and new AI product developments.

Extending partnership with Siemens

Atos gave an update on its ongoing investment with Siemens, a seven-year-old partnership that was recently extended to 2020 with the addition of an extra €100m in funding (pushing the total amount of funding to €330m). The partnership focuses on data analytics and AI, IoT and connectivity services using the MindSphere platform, cybersecurity, and digital services.

We were also encouraged by what appears to be renewed interest from Siemens in building out the platform’s ecosystem, with Siemens now partnering with Infosys to develop applications and services for MindSphere. Developing the MindSphere platform and growing the business beyond Siemens will be a critical factor for growth in Atos’ IoT business.

Google partnership update

Atos presented on its partnership with Google, announced in April 2018, framing the partnership around AI. Within the partnership scope, Atos’ initiatives are to:

  • Develop its Canopy Orchestrated Hybrid Cloud to integrate Google Cloud Platform (which will become Atos’ preferred cloud computing choice)
  • Create a ML practice using Google Cloud’s ML APIs, and develop vertical-specific algorithms
  • Use G Suite as part of its Digital Workplace offerings.

The first Atos/Google co-lab has now opened, with two more set to follow: in Boulogne, Paris (opening in fall 2018) and in Dallas, TX. Each of these labs has 50 specialists: 25 from Atos and 25 from Google.

AI product developments

Atos announced the release of its Codex AI product suite. The suite aims to act as a workbench for clients’ AI development and for managing AI applications. The advantage of Atos’ suite is its ability to push and manage AI algorithms across a number of architectures in an infrastructure-agnostic manner, which we found particularly interesting with regard to edge computing.

Edge computing has a multitude of applications for IoT services. Even now, the volume of data sent by IoT devices can strain the ability to communicate it back for analytic model development, and in some cases data simply cannot be communicated in a timely manner. Edge computing reduces the amount of data that needs to be sent ‘home’ by providing computing power for running algorithms at the device end. Linking back to the Codex AI suite, a key selling point of the edge computing box will be the ability to develop models on HPC stacks and then run algorithms built in the suite at the edge.
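
Details of how the Codex AI suite deploys to the edge are not public, but the general pattern it targets can be sketched simply: train centrally, score locally, and transmit only what matters. In this illustrative sketch, the model file, alert threshold, and endpoint are all assumptions.

```python
# Generic edge-inference sketch: score sensor readings locally and send
# only anomalies 'home'. Model, threshold, and endpoint are assumptions.
import json
import urllib.request

import joblib

# Model trained centrally (e.g. on an HPC stack), then shipped to the device
model = joblib.load("anomaly_model.joblib")

def on_sensor_reading(features):
    # Inference runs on the device: cheap, and no network round-trip
    score = model.predict_proba([features])[0][1]
    if score > 0.9:  # assumed alert threshold
        payload = json.dumps({"features": features, "score": score}).encode()
        req = urllib.request.Request(
            "https://central.example.com/alerts",  # hypothetical endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
    # Readings below the threshold stay on the device, saving bandwidth
```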

Other future-looking announcements included the release of an updated version of Atos’ Quantum Learning Machine (QLM) and a prototype of hardware for an edge computing box.

Comparing these announcements to the ‘2019’ ambition plan set out in 2016, it is clear that Atos has the potential to outperform its Codex expectations. The Codex business, which encompasses business-driven analytics, IoT services, and solutions including MindSphere and now the Codex AI suite, had a revenue target of €1bn by 2019, up from ~€500m in 2016. Codex experienced very significant growth in 2017, delivering revenues of ~€760m, in part due to its acquisition of zData in the U.S. We expect to see further acquisitions in the Codex business as projects from MindSphere and the releases announced at this event ramp up.

 

In sum, the strengthening of key partnerships, the AI product developments, and the slow build of the Quantum business establish a foundation for the long-term growth of Atos’ digital and data-oriented business.

L&T Infotech’s Assurance Platform: Identifying Automation Gaps In Data-Related Testing

 

NelsonHall has been advocating for some time that software testing services vendors should focus on automating full testing processes or testing functions, and no longer focus solely on standalone accelerators/point solutions. This is commonly called a ‘platform’ approach, and we define platforms as pre-integrated/interfaced COTS, accelerators, and open source software. The creation of testing platforms is still a work in progress, with most of the work in the past two years centering on DevOps platforms.

The good news is that testing services vendors are expanding their efforts from DevOps to other testing areas, driven by demand for platforms in digital, in test support activities, and in non-functional testing. With this in mind, L&T Infotech (LTI) recently briefed NelsonHall on two main automation initiatives it is leading: one around data-related activities, and one around digital. Here I take a quick look at LTI’s data-related automation initiative.

So what’s in LTI’s Assurance platform for data testing? There are four main IPs, mostly around analytics and big data, and much of the activity involves checking the consistency of data across sources and targets. Data Testing Framework (DTF) focuses on testing data that has been migrated from one EDW to another. LTI also has a BI-specific tool, OLAP Comparator, for validating data across dimensions/tables.

And then there are the challenges brought by big data and its many different data structures. With its automated testing solution BigTest, LTI addresses several scenarios, e.g. migrations from sources including flat files, RDBMS, and NoSQL databases to Hadoop.
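
LTI’s IPs are proprietary, but the core pattern behind this kind of migration testing is source-vs-target reconciliation: confirm the volumes match, then confirm the contents match. Here is a minimal illustrative sketch; the connection, table, and key column are hypothetical.

```python
# Minimal source-vs-target reconciliation sketch, the pattern at the
# heart of data migration testing. Connections, table, and key column
# are hypothetical; LTI's DTF and BigTest IPs are proprietary.
import hashlib
import sqlite3

import pandas as pd

source_conn = sqlite3.connect("source_edw.db")  # hypothetical legacy EDW
source = pd.read_sql("SELECT * FROM customers", source_conn)
target = pd.read_parquet("lake/customers.parquet")  # migrated copy

# Check 1: row counts must match
assert len(source) == len(target), "Row count mismatch after migration"

# Check 2: per-row checksums catch silent corruption that counts miss
def checksums(df, key):
    ordered = df.sort_values(key).reset_index(drop=True).astype(str)
    rows = ordered.agg("|".join, axis=1)
    return rows.map(lambda r: hashlib.md5(r.encode()).hexdigest())

diffs = checksums(source, "customer_id").compare(checksums(target, "customer_id"))
assert diffs.empty, f"{len(diffs)} rows differ between source and target"
```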

Apart from the above IPs, LTI also has a number of accelerators mostly related to report and file comparisons, e.g. PDF, XLS, and XML files.

Outside of data analytics, but still in the field of data, LTI has its ETDM IP, focusing on automated test data management. This is important: test data management is a key element of test support services, along with service virtualization and environment management, in providing career testers with the tools they require to proceed with testing. The key features of ETDM are:

  • Data sub-setting (subsets the data from production to the test environment)
  • Data masking (masking clients’ sensitive data using pre-defined rules while maintaining data integrity; see the sketch after this list)
  • Compliance testing (identifying data that needs to be masked and meeting data security standards such as PCI-DSS)
  • Synthetic test data generation (with pre-defined rules for various data types).
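
ETDM’s internals are not public, but data masking of this kind typically rests on deterministic substitution: the same production value always maps to the same surrogate, so masked records remain joinable across tables while real values never leave production. A minimal sketch, with assumed field names and rules:

```python
# Minimal deterministic-masking sketch: sensitive fields are replaced by
# surrogates that stay consistent across tables, preserving joins.
# Field names and rules are assumptions, not ETDM's actual behavior.
import hashlib

import pandas as pd

def mask(value: str, salt: str = "per-project-secret") -> str:
    # Same input always yields the same surrogate, preserving integrity
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

prod = pd.DataFrame({
    "customer_id": ["C001", "C002"],
    "email": ["alice@example.com", "bob@example.com"],
    "card_number": ["4111111111111111", "5500005555555559"],
})

test = prod.copy()
for col in ["email", "card_number"]:  # pre-defined sensitive columns
    test[col] = test[col].map(mask)

print(test)  # safe to load into the test environment
```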

At this point, LTI’s Assurance platform is more a series of standalone accelerators around two main themes – analytics & big data, and test data management – although with a common look and feel. However, the good news is that LTI has worked on identifying automation gaps across those themes, and the company is continuing its efforts in automating data-related functionality gaps.

We will be commenting in a separate blog on the digital IP that LTI recently introduced.
