Deloitte and SAS have been closely collaborating for decades to make emerging trends in technology and analytics a practical reality for clients around the world. The
combination of Deloitte’s hands-on, industry-level experience and SAS’ world-leading analytics capabilities helps deliver stronger outcomes for businesses everywhere.
Each year, Deloitte’s Tech Trends report sets the standard for forward-thinking insights on what’s next in technology. Together, we put tomorrow’s technology trends
to work today. Explore them below.
Wondering if all this will really happen?
Just watch.
If you’re reading this, you already know why analytics is crucial to gleaning the full business value out of any technology implementation. Analytics is how organizations make connections between all this data, identify useful or insightful patterns, and apply the resulting insights to their business.

At least that’s how it goes in theory.

In reality, a lot of leaders are struggling – not because they don’t understand what analytics can deliver, but because they’re running into obstacles. Many leaders don’t have the budgets or talent to deliver on existing analytics strategies, much less tackle more innovative, forward-looking opportunities. Others are stuck in the data governance/data quality/data prep loop, making sure their data house is in order before inviting everyone into the big analytics and AI party.

These are real challenges that cannot and should not be ignored. But don’t let them blind you to the possibilities presented by Deloitte’s latest technology trends. Consider that only a few years ago, the idea of running analytics in the cloud qualified as an out-of-reach trend. Now, cloud services arrive with advanced analytics capabilities built in, requiring minimal coding or integration on the part of the user. It just works – because a wide range of providers saw the opportunity and made it happen.

Whether it qualifies as a trend or not, data is pulsing through these technologies at volumes that are – all hype aside – unprecedented. And the volume is only growing as more advances unfold.

Do all of these trends “just work” today? No – although aspects of them do. But we’re working on that. Analytics capabilities delivered in a managed services model are already making it easy to bring some early-phase aspects of these trends to life. As is the Deloitte-SAS Center of Excellence, where we facilitate important conversations about these trends between our two organizations in collaboration with our clients.

It’s happening – and we’re thrilled to be working together to translate these trends into business value and tangible outcomes.

Nat D’Ercole
Omnia AI Data Transformation and Ecosystems & Alliances Leader
Partner, Deloitte
SAS PERSPECTIVE
Data sharing made easy
Analytics and the trust imperative
The real barrier to large-scale data sharing among
organizations – one of the key trends identified in Deloitte’s
most recent Tech Trends report – isn’t the technology
required to enable such sharing. Nor is it finding enough useful
data to feed this type of unprecedented collaboration.
The primary obstacle? Trust. Cultivating, managing and
sustaining trust over the long term – enabling organizations
with adjacent or even competitive aims to make use of shared
stockpiles of data – is the most important prerequisite to the
type and level of strategic data sharing outlined by Deloitte.
In fact, given rapid recent advances in analytics, AI, data
management and other key technology enablers, an inability
to create the conditions for trust is likely the only reason data
sharing is considered a forward-looking trend rather than an
everyday reality today.
At the same time, data sharing is already happening in
some important areas. In response to the emergence of
COVID-19, an array of government agencies, pharmaceutical
firms and medical care providers quickly mobilized to share
data about the virus and its impact in order to inform vaccine
development and to develop and launch prevention protocols.
This type of collaboration was undertaken in the context of an
unprecedented, society-level challenge – the normal rules of
engagement changed in an instant. But data sharing can and
should be happening in smaller, more focused ways across
industries today.
While analytics capabilities have a clear role in making sense of
shared data, there are some less obvious ways in which analytics
principles and strategies can contribute directly to the trust
that is required for multiple organizations to share their data.
Whether your organization is gearing up for a new data sharing arrangement or simply considering it, here are three ways in which your analytics infrastructure can directly contribute to the trust that is needed to move ahead with confidence.

Model sharing
Imagine a scenario in which a research hospital quickly develops a promising analytical model for oncology treatment, but doesn’t have the capacity to run the model at the scale required to show the desired results. Other organizations in the research community could provide that scale, with the hopes of uncovering insights that could improve patient outcomes faster. Can the hospital share the model? Can they trust the results of others’ model deployments – and can the other institutions trust the model itself? Models developed with analytics systems that come with built-in due diligence – that are validated and certified to be reliable and trustworthy – make it easier to facilitate efficient model sharing among different organizations, each of which can see how the model was constructed and why it is generating certain outcomes. This combination of data ops and model ops is likely to become the standard as model sharing becomes a more widespread practice.

Metadata
When was this data set created? Based on what inputs? Is it the right type of data for my purposes, and can I trust it? When two or more organizations agree to share data, these are the types of questions that each partner is likely first to ask about the data they’ve been provided. They’re the same questions they ask about their own data – but it can be difficult to assess data quality and relevance without ownership. In these scenarios, metadata attributes can be built into the analytics approach, giving each partner valuable insights into large volumes of data that would otherwise present a daunting, time- and resource-intensive vetting challenge. Metadata strategies also make it easier for partners to quickly assess which tranches of data are useful for their purposes – and which are irrelevant – using a common language shared between partners and their systems.

By developing their own metadata standards, individual industries can accelerate the application of metadata as well as create the conditions of trust required for successful data sharing. Today, industries such as healthcare are awash in such standards, while others are lagging far behind.
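To make the idea tangible, here is a minimal Python sketch of the kind of metadata record a partner might publish alongside a shared data set. The attribute names are illustrative assumptions, not a published industry standard.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetMetadata:
    # Illustrative attributes only -- not a published metadata standard.
    name: str
    created: date                # when was this data set created?
    source_systems: list         # based on what inputs?
    row_count: int
    completeness: float          # share of non-null cells, 0.0-1.0
    last_validated: date
    steward: str                 # who answers questions about the data
    tags: list = field(default_factory=list)

claims = DatasetMetadata(
    name="claims-2023",
    created=date(2024, 1, 15),
    source_systems=["claims-core", "provider-registry"],
    row_count=1_200_000,
    completeness=0.97,
    last_validated=date(2024, 6, 1),
    steward="data-office@example.org",
    tags=["claims", "de-identified"],
)
print(claims.name, claims.completeness, claims.last_validated)

A record like this lets a partner answer the trust questions above before committing resources to a full vetting exercise.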
Synthetic data creation
Sometimes potential data-sharing partners encounter obstacles that may at first seem insurmountable. The data they need may be unavailable, for example. Maybe it’s available, but too costly to acquire. Or it’s so private and sensitive that partners are unwilling to share. Sometimes the available data is full of gaps – underrepresented segments or conditions.

Data-sharing partners using synthetic data are able to use the essential attributes of real data without compromising security or confidentiality, as long as the synthetic data is representative of reality and was not generated in a use-case-specific manner. The original data remains in the original data centers, while partners are free to use the synthetic versions to feed a range of analytical models. At an aggregate level, the resulting insights are accurate and useful.

Advanced analytics tools will increasingly come with the ability to generate and analyze synthetic data in a way that generates valuable insights without compromising data security.
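Here is a deliberately simplified Python sketch of the aggregate-level idea: synthetic rows are drawn to match the means and correlations of a real numeric table without reproducing any actual row. Production synthetic-data tools are far more sophisticated; the data here is simulated.

import numpy as np

def synthesize(real, n_samples, seed=0):
    # Draw synthetic rows that match the mean vector and covariance
    # matrix of the real numeric table without copying any actual row.
    rng = np.random.default_rng(seed)
    mean = real.mean(axis=0)
    cov = np.cov(real, rowvar=False)
    return rng.multivariate_normal(mean, cov, size=n_samples)

# Simulated "real" data with correlated columns; it never leaves its owner.
rng = np.random.default_rng(1)
latent = rng.normal(size=(5000, 3))
real = latent @ np.array([[1.0, 0.6, 0.2], [0.0, 1.0, 0.4], [0.0, 0.0, 1.0]])

shared = synthesize(real, n_samples=5000)  # only this synthetic draw is shared
print(np.corrcoef(real, rowvar=False).round(2))
print(np.corrcoef(shared, rowvar=False).round(2))  # similar aggregate structure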
TRANSFORMATION TAKES TRUST
Data sharing has the potential to usher in a new era of more powerful, rapid advances across industries. But this transformation is not likely to take hold at a massive scale, all at once – it will more likely unfold in a piecemeal fashion, led by industries and sectors where there is a strong, sometimes unavoidable imperative for change. This is exactly the scenario that led governments, public health agencies, universities and medical providers to collaborate at the data level in response to the onset of COVID-19, in ways that were unthinkable beforehand.

This trust-enabled transformation will not happen without the active, focused efforts of technology providers who understand the value of data sharing to their users and begin building data sharing tools and capabilities into their solutions. For those that have not already begun doing so, now is the time.
SAS PERSPECTIVE
Cloud goes vertical
Analytics implications of an industry-focused cloud
In its 2022 Tech Trends report, Deloitte identifies the rise of
clouds tailored to specific industry verticals as a development
likely to shape the future of business and technology.
Here at SAS, we see plenty of evidence of this trend already
at work. We’ve been coordinating with the leading cloud
providers to develop and deploy industry-focused analytics
and AI capabilities for years. Today we are finally seeing some
of these capabilities reach the marketplace as cloud adoption
further expands.

The implications of this development for data and analytics practices are significant. The impact on industry solutions is equally powerful, as organizations are now able to take advantage of industry-tailored analytics that are delivered as part of a broader cloud package. This gives users the ability to accelerate the deployment of analytics capabilities while avoiding many of the technical challenges that might otherwise stand in their way.

Several data- and analytics-related developments are likely to unfold on the way to industry clouds. Here’s what to prepare for in the coming years.

Standardized/off-the-shelf analytics algorithms
At the industry level, organizations share many of the same or similar needs for insights. For example, all hospitals need the ability to anticipate resource needs in the short and long term – a relatively simple analytics task that can easily be adapted to individual organizations using a generic cloud-based capability. Industry-oriented clouds will serve these types of needs with foundation-level analytics tools that make it possible for the organization to flex its analytics muscles in other, higher-value areas. Expect industry clouds to have the greatest early impact in operational areas such as workflow management before taking hold in more specialized, complex parts of the organization.

Open data standards
Industry clouds will reach their full potential only when industries are operating with a shared set of data standards – a development that is already unfolding in pockets of the financial services and health care industries. In health care, these standards – Fast Healthcare Interoperability Resources, or FHIR – are emerging from broader industry trends pushing in the direction of standardization, and are managed by an independent industry organization. With disparate health care organizations adhering to the same data standards, it’s easier for cloud providers to develop and deploy analytics models in their cloud environments. These models can then be adopted and further customized by a wide range of organizations in the same industry – an obvious benefit.

But these standards introduce a host of less obvious benefits as well. For example, FHIR standards allow users to securely access only the specific information they need, when they need it. While a health payer has multiple clinical and operational data systems, with FHIR their analytics teams can select the specific patient or claims data needed for business insight without having to bring in all the data – saving time and resources.
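The selective-access pattern is easy to picture in code. Below is a minimal Python sketch of a standard FHIR R4 search call; the server base URL, patient reference and parameter values are hypothetical placeholders, and a real deployment would add authentication.

import requests

BASE = "https://fhir.example.org/r4"  # hypothetical FHIR R4 endpoint

# Pull only one patient's claims created this year, 50 per page, instead
# of replicating the payer's entire claims system into the analytics store.
resp = requests.get(
    f"{BASE}/Claim",
    params={"patient": "Patient/123", "created": "ge2024-01-01", "_count": 50},
    headers={"Accept": "application/fhir+json"},
    timeout=30,
)
resp.raise_for_status()
bundle = resp.json()  # a FHIR "searchset" Bundle
for entry in bundle.get("entry", []):
    claim = entry["resource"]
    print(claim["id"], claim.get("status"))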
As-needed (rather than always-on) access to analytics capabilities
For many business processes, constant monitoring and analysis is not required to produce useful business insight. But many organizations rely on expansive data environments that require them to constantly host, monitor and analyze all the data – not just the parts they need – in order to answer focused questions. Their data environments are neither modularized nor particularly efficient – all the data must be up and running all the time to accommodate queries.

Industry cloud environments, developed in a way that distinguishes between large-scale, mission-critical, always-on insight requirements and more focused, intermittent insight needs, are able to provide users with a richer set of options. Just as important, a modularized, cloud-based approach can reduce costs, since the organization isn’t using compute services around the clock – like living in a house where the lights are never left on in unoccupied rooms.

The scramble for innovation and competitive differentiation
As organizations within industries benefit from a shared set of analytics tools, they will face new pressure to distinguish themselves on their ability to generate unique insights. The organizations most likely to succeed are those able to devote resources to analytics innovation and go beyond simply monitoring and managing the baseline analytics tools embedded in industry cloud solutions. Operationalization is also a big part of the appeal of industry clouds in terms of innovation. Too often, after teams develop a successful innovation, they struggle to roll it out more broadly across the organization. The reasons are technical and related to resource allocation. But the more modular the analytics tools, the easier they are to operationalize in other parts of the organization. The most successful are lightweight and ready to be dropped into different initiatives.
Early boon for “challenger” organizations
As industry clouds become the norm, midtier and “challenger” organizations are likely to see the most benefit. Industry leaders will typically have advanced analytics capabilities in place, so they are less likely to see upside in off-the-shelf analytics capabilities. At the same time, however, these more mature organizations are likely to benefit by selling versions of their analytics tools and models in the marketplace. In this scenario, everyone benefits – mature organizations create a new revenue stream from existing IP, and up-and-comers get the benefit of their more advanced peers’ hands-on experiences and insights. “Model marts” may become the norm in some industries.

Among industries that have less mature analytics capabilities overall, compared with their peers in retail and banking, industry clouds could set off a rush of insight, because industry leaders and up-and-comers alike will avail themselves of these tools.

THE TRUST IMPERATIVE
Ask technology and business leaders across industries for their thoughts on new industry cloud offerings, and you’re likely to find that they are bullish on the opportunity but hesitant to adopt. This is understandable. Whether the organization already has sophisticated analytics capabilities or is earlier in its journey, technology leaders are skeptical of implementing any off-the-shelf analytical model, even those designed specifically for their industry. Large organizations are especially hesitant after working for years to develop and improve their own data centers. Even if those data centers have problems, they are known problems.

Model transparency is critically important for overcoming any trust deficit in new industry cloud-based analytics capabilities. Models should come pre-validated, and organizations should be able to replicate results. Emerging vertical cloud offerings should give industry users the confidence they need to adopt and operationalize industry cloud analytics capabilities as industry clouds become more widely available.
SAS PERSPECTIVE
Blockchain: ready for business
Using analytics to make blockchain a practical reality
As blockchain continues to evolve out of the hype cycle and
into its role as a practical business enabler, leaders across
industries need to find ways to square blockchain’s exotic
reputation with its practical, everyday potential. That can
be difficult given blockchain’s strong connections to the
emergence of NFTs, cryptocurrencies and other headline-grabbing technology developments that can make blockchain
appear unsuited to more traditional environments and
challenges. But blockchain has significant potential to
transform more traditional business realms. This is especially
true, as Deloitte notes, as organizations are actively reimagining
“how they make and manage identity, data, brand, provenance,
professional certifications, copyrights, and other tangible and
digital assets.”
Analytics-based approaches are likely to be one of the most
effective ways to connect blockchain’s high-flying potential
with day-to-day business needs. Plus, in some industries,
analytics are likely to be required by regulators looking to
ensure security. In 2022, the New York Department of Financial
Services (DFS) issued guidance recommending that all digital
currency companies operating under New York banking law
adopt blockchain analytics to trace transactions. As a “blockchain
bellwether,” the financial services industry offers a glimpse
of the future awaiting other industries experimenting with
blockchain.
Here are three of the most likely ways analytics will be deployed
alongside blockchain capabilities to make them a true force in
business.
Focus on fraud first
Much of the appeal of blockchain comes from the anonymity it provides to users. Anonymity is perhaps the core feature of blockchain – and it opens the door to a host of possibilities for organizations in terms of security and privacy. It also presents a tantalizing target for those wishing to commit fraud.

Today, this is one of the most significant barriers to the mainstream adoption of blockchain. How can organizations monitor, detect and prevent fraud in an environment created specifically to deliver anonymity, using anonymized registers and data sources? By using the same tested, proven analytical approaches to fraud prevention they’ve used for years elsewhere in their organizations. Blockchain represents a potentially transformative break from the mainstream – and for mainstream users to adopt and apply it in a meaningful way, they need the confidence of knowing that they can keep fraud in check. Analytics tools offer the most powerful way to deliver order and confidence in a blockchain environment.
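As one illustration of reusing a proven fraud test on ledger data, the Python sketch below applies a classic structuring check – repeated transfers just under a reporting threshold – to a hypothetical extract of pseudonymous transactions. The addresses, amounts and threshold are invented for the example.

import pandas as pd

# Hypothetical ledger extract: pseudonymous addresses and transfer amounts.
tx = pd.DataFrame({
    "addr": ["0xa1", "0xa1", "0xa1", "0xb2", "0xb2", "0xc3"],
    "amount": [9_900, 9_950, 9_800, 120, 95_000, 40],
})
THRESHOLD = 10_000  # classic structuring rule reused from traditional fraud work

# Flag addresses that repeatedly transact just under the threshold.
near = tx[tx["amount"].between(0.9 * THRESHOLD, THRESHOLD, inclusive="left")]
counts = near.groupby("addr").size()
print(counts[counts >= 3])  # addresses with three or more just-under-the-line transfers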
Lean on third parties
Who’s going to run analytics on our blockchain activities? This can be a difficult question to answer in organizations where analytics talent is already stretched thin, and there are already a host of other pressing insight needs in other parts of the business. But without a good answer, it is impossible for business and technology leaders to move ahead with adoption and take advantage of blockchain’s powerful capabilities.

In the short term, many will conclude that support from third parties offers the most direct, practical, accelerated path forward. Already, the number of blockchain-focused technology and analytics service providers has begun to grow rapidly, building on experiences gained in cryptocurrency markets and other early-phase staging areas for blockchain.

Build a bridge to regulators with analytics
Blockchain is already on the regulatory radar, due primarily to the explosion of blockchain-enabled cryptocurrencies. Today the regulatory environment is highly splintered, with different regions, countries and jurisdictions all bringing their own approaches to regulation. It is also primarily focused on applications of blockchain in the financial services industry. In coming years, these approaches will begin to coalesce as regulatory experiences point the way to best practices. They will also expand to address a wider range of industries, as blockchain continues its march into other areas.
Along the way, regulators will demand more information on organizations’ risk exposure due to blockchain activities. In that regard, regulators are seeking essentially the same information as the organizations themselves – they want to know exactly where and how blockchain is being used, which threats the organization faces as a result, and how the organization is responding. This is exactly the type of information analytics systems should be configured to address. Forward-thinking organizations (and the third parties who support them) can implement these systems in ways that enable data sharing with regulators, smoothing the path to compliance as regulators pursue enforcement of new blockchain-focused laws as they emerge.

BLOCKCHAIN ANALYTICS: BEYOND FRAUD
Analytics has a role to play anywhere there is data – and blockchain activities generate massive volumes of it. As technology and business leaders begin to experiment with blockchain, analytics should be part of the plan. The most pressing reason to incorporate analytics practices into the blockchain strategy is to satisfy existing or upcoming regulatory requirements in fighting fraud. As New York’s DFS has shown, regulatory interest in blockchain analytics is already significant in some quarters, and this will continue as blockchain adoption expands.

However, using analytics solely to satisfy fraud-related regulatory requirements in blockchain would be a mistake. How are an organization’s employees, customers, partners and others using blockchain to engage? What patterns are already in place – and what future patterns are likely to emerge? Where is blockchain working well – and where is it falling short of expectations? What are the most immediate opportunities to use blockchain to transform other parts of the organization? These are the types of questions that analytics can help leaders answer as they expand their focus beyond regulatory compliance and take advantage of the full value of the data generated by blockchain. Security first – but don’t plan on stopping there.
SAS PERSPECTIVE
Automating at scale: 3 key analytics questions
Analytics implications and practical insights for
moving ahead
Is it any surprise that Deloitte has identified large-scale
automation as one of the key technology trends expected
to shape the future? The events of the past few years have
led business and technology leaders alike to redouble
their efforts to embrace transformative technologies –
and automation tops the list for many, given its ability to
provide continuity under challenging conditions, as well
as its ability to support human operators who are stretched
thin in the face of a talent shortage.
Just as important, foundational automation capabilities are
proven and tested in the real world. Automation is happening –
and it’s been underway for years. The main difference today,
as Deloitte notes, is that the tide is turning: Automation is
reaching a phase of critical mass, momentum and scale that
will make it an unavoidable force across industries.
All of which brings us to analytics. Given the amount of
data required to inform the creation of automation tools
and operate them over time, not to mention the volumes
of data automation generates, analytics plays a predictably large
role in automation at scale.
Which repeatable processes are generating the
most data?
Repeatability is an obvious consideration for anyone looking
for processes that could be automated. But from an analytics
perspective, some processes are more “data-immersive”
than others – and those are the ones that present the most
opportunity in terms of the organization’s ability to replicate
them with analytics and automation, and to do so quickly.
Many organizations have focused their first automation efforts on internal processes, as a way to minimize risk and gain skills and insights before they expand their automation strategies to include customer-, partner- and public-facing processes. Internally, human resources processes provide a data-rich environment full of repeatable processes, many of which already rely heavily on a self-service approach. Adding a layer of automation to these processes, where embedded solutions are already creating and capturing a significant amount of data, can be a productive starting point for large-scale automation ambitions.
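A first pass at finding the most data-immersive candidates can be as simple as profiling event logs by process, as in the hypothetical Python sketch below; the process names and payload sizes are invented for illustration.

import pandas as pd

# Hypothetical process-event log: which repeatable processes emit the most data?
events = pd.DataFrame({
    "process": ["expense-approval", "pto-request", "badge-access",
                "expense-approval", "pto-request", "expense-approval"],
    "payload_kb": [12, 3, 1, 15, 4, 11],
})
profile = (events.groupby("process")
           .agg(runs=("payload_kb", "size"), data_kb=("payload_kb", "sum"))
           .sort_values("data_kb", ascending=False))
print(profile)  # the data-immersive processes float to the top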
Do we have enough of the right people to ingest, understand and act on the data?
Automation generates a significant amount of data, which in turn informs the development of insights – about how effectively automated processes are running, the outcomes they’re generating, where processes can be improved, and much more. Because automation requires ongoing human oversight in order to deliver its full value, it’s important to ensure that automated processes are supported by enough human operators to act appropriately on the analytics-derived insights they are generating.

While this sort of activity was once the sole domain of business analysts and data scientists, advances in the delivery and presentation of analytics insights mean that a wider range of people in the organization can play a role in this aspect of analytics in automation. That’s important, since data scientists and those in similar roles are in short supply for the foreseeable future.

SAS’ Jonathan Tottman has spent his career at the intersection of analytics and law enforcement, and has observed a growing disconnect: While a steadily increasing number of dynamic analytical tools are available to support more effective enforcement practices, the profession’s rigid attachment to traditional staffing models and policing patterns means that these tools remain underutilized in practice. “When I talk to police leaders today, I often ask them, ‘Of the 1,000 people in your organization, how many are police officers?’” says Tottman. “When they reply that all 1,000 are police officers, I tell them that today half should be analysts, using analytics tools to help their departments become more effective in anticipating, preventing and mitigating crime.” This will only become more important as the nature of crime itself evolves to become more digital, with technology-driven crime emerging as a dominant trend.
Which analytics capabilities are already available in our commercial solutions?
Many providers of commercial solutions, anticipating a boom in demand for AI, have been building analytics and AI capabilities into their solutions. Much like new features that are embedded in commercial technologies such as phones and televisions, these additions may be underutilized or even unknown to those who own them.

Cloud offerings are a prime example of this phenomenon. Each of the leading cloud providers is working with partners to deliver fully integrated, advanced analytics capabilities as part of their service packages. Similarly, analytics providers like SAS have launched cloud-native and cloud-ready solutions that are designed to sync with commercial cloud offerings. This is the case for virtually every enterprise-level solution available today. Before embarking on any AI and analytics journey, no matter what size, IT and business leaders should take stock of the potentially underused or overlooked capabilities already available to them.

THE MOST IMPORTANT BARRIER REMAINING
As with many technology advances, as the technical barriers to adoption fall away, they reveal a more persistent obstacle: cultural change. Just as analytics requires a level of openness to insights that are sometimes uncomfortable or unexpected, successful AI adoption requires a significant mindshift. Executives and front-line workers alike can be expected to resist ceding aspects of their work to automation, and to question the efficacy and accuracy of tasks and decision-making activities that have been automated.

Analytics insights may have a higher-order role to play here as well. How is automation really performing? What is the impact on our efficiency, and on the quality of our decision making? What is the scale and nature of improved business outcomes resulting from our automation initiatives? Good answers to questions like these can help mitigate the cultural barriers that stand in the way of automation – and an analytics-driven approach can help provide employees at every level with the insights they need to quickly adapt to an automation-enabled environment.
SAS PERSPECTIVE
Cyber AI: real defense
The force multiplier that is cyber AI + analytics
Cybercrime isn’t new. But the scale of cybercrime continues
to explode, leaving business and technology leaders alike
scrambling to find effective strategies for defending against
criminals who are using more sophisticated techniques at
every turn.
The stakes are high. As Deloitte notes in its Tech Trends 2022 report, the cost of cybercrime is expected to grow to US$10.5 trillion by 2025, compared to an estimated $6 trillion by the end of 2021. Deloitte also cites insurer AIG’s report that ransomware claims alone have grown 150% since 2018.

Deloitte is right to note that “cyber AI can be a force multiplier that enables organizations not only to respond faster than attackers can move, but also to anticipate these moves and react to them in advance.” As these cyber AI capabilities ramp up, it’s time to consider exactly which analytics approaches are likely to play a leading role in the large-scale deployment of AI in fighting cybercrime.

Fortunately, a host of analytics strategies and capabilities have already been tested on the cybercrime battlefield. Most were not deployed in an extensive AI context – but that may be about to change quickly. Here are some of the analytics approaches that, depending on the industry in which they are applied, are poised to create a “force multiplier” effect with AI.

Comparison analytics
When a network-connected machine or device suddenly operates differently, is this the result of suspicious activity, or simply the result of something harmless and explainable?
Comparison analytics is a strategy for identifying anomalies and patterns and zeroing in on those that may signal criminal activity.

For example, a user may engage in the same activities day after day – sending 10-25 emails before 10 a.m., going quiet for several hours, scanning emails intermittently throughout the day, and logging into the same three systems. When that predictable pattern is disrupted – when the user accesses an entirely different system or taps into non-routine processes and data within systems – it could be for good reasons, or that activity could signify that the machine has been compromised.

Comparison analytics tools can help identify these types of signals and flag potential risk markers using peer grouping strategies, clustering algorithms and other widely deployed approaches. But this strategy can only succeed when the organization has established baseline patterns of “normal” activities. That’s no small feat, which is why this approach can be so difficult to get off the ground.

It can be expensive and time-consuming to map digital asset activities and identify when and how individual employees and departments typically use various systems – especially when the systems themselves often don’t interface with one another. Regulatory constraints can also make establishing a baseline a daunting task.

Technology advances are making this effort more straightforward. After all, digital systems generate data at every turn. This data can be loaded directly into monitoring systems equipped with pattern recognition tools, deep learning and other analytics-driven features. Such capabilities can help identify patterns and make connections beyond the grasp of human analysts due to the scale and volume of data involved. Technology leaders should start by determining which of their systems are most susceptible to being compromised, then expand from system to system as their organizations gain proficiency.
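As a rough illustration of the peer-grouping idea, the Python sketch below clusters users into behavioral peer groups and flags anyone unusually far from their group’s centroid. The features, numbers and threshold are invented for the example, not a SAS method.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical per-user daily features: emails before 10 a.m.,
# distinct systems accessed, after-hours logins.
normal = rng.normal([18, 3, 0.2], [4, 0.8, 0.4], size=(500, 3))
odd = rng.normal([60, 9, 5.0], [5, 1.0, 1.0], size=(5, 3))  # disrupted patterns
X = StandardScaler().fit_transform(np.vstack([normal, odd]))

# Peer grouping: cluster users into behavioral peer groups, then flag
# anyone unusually far from their own group's centroid.
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
dist = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)
threshold = np.percentile(dist, 99)
print("flagged users:", np.where(dist > threshold)[0])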
Rarity analytics
When network or digital asset users do something completely out of the ordinary, they could have a legitimate reason for doing so, or they could be introducing cyber risk to the organization. Meanwhile, relatively rare or uncommon events such as monthly maintenance updates present opportunities for cyber criminals to exploit networks. Both circumstances should be closely monitored for breaches. Rarity analytics capabilities help identify risk-significant rare events as well as more closely monitor user behaviors during routine-but-uncommon events such as system updates.

The SolarWinds hack of 2020, in which hundreds of large organizations worldwide (including many US government departments and agencies) suffered extensive data breaches, underscores the role that rarity analytics can play in defending against attacks. During that attack, organizations with rarity analytics capabilities in place suddenly began producing thousands of alerts each hour. Those alerts spurred defensive actions to identify and mitigate threats.

RISING TO MEET THE CHALLENGES OF SCALE WITH AI
Any of the analytics capabilities described here can have a powerful impact on their own. But when paired with AI, they can help organizations rise to meet the challenges of scale, in an environment in which cybersecurity threats are more complex and growing in number. Given the scarcity of both cybersecurity and analytics talent today, which shows no sign of relenting in the near future, AI is even more important as a tool for meeting the growing scale of cyber threats. As Deloitte says, “it’s time to call for AI backup.” Organizations should be actively exploring opportunities for developing their AI acumen by combining AI and analytics capabilities to make sure they’re ready for what’s next.

Temporal analytics
Imagine you’re a cybercriminal seeking to exfiltrate 200 terabytes of customer data. Attempt to do it all at once, and you’re likely to trip the alarms of even the most basic systems. But if you’re patient and siphon off a single megabyte or two each day over time, you may fly under the radar and get all the data you want.

Slow-and-steady attacks of this nature are extremely difficult to detect without the help of temporal analytics, which analyzes network traffic and all digital assets on the network over time. Be it a week, three weeks, six months or a year, temporal analytics capabilities seek out discreet patterns that would otherwise go undetected or be uncovered long after the damage was done.
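To make the slow-exfiltration pattern concrete, here is a toy Python sketch that accumulates each host’s deviation from the fleet’s daily median transfer volume; the log, hosts and leak size are all simulated assumptions.

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2024-01-01", periods=90)
hosts = [f"h{i}" for i in range(20)]
log = pd.DataFrame(
    [(h, d, rng.normal(120, 8)) for h in hosts for d in days],
    columns=["host", "day", "mb_out"],
)
# One host slowly exfiltrates ~5 extra MB per day -- far too little to trip
# any single-day threshold, but relentlessly steady.
log.loc[log["host"] == "h7", "mb_out"] += 5

# Temporal view: accumulate each host's deviation from the fleet's daily
# median. A slow leak shows up as a steadily climbing cumulative excess.
fleet_median = log.groupby("day")["mb_out"].transform("median")
log["excess"] = log["mb_out"] - fleet_median
total_excess = log.groupby("host")["excess"].sum().sort_values()
print(total_excess.tail(3))  # h7 stands well above every peer after 90 days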
SAS PERSPECTIVE
The tech stack goes physical
Managing physical assets
According to Deloitte’s Tech Trends 2022 report, IT leaders
have one more important, emerging category of technology
infrastructure to manage: Physical assets. That’s because
the proliferation of smart devices and other sensor-enabled
physical objects, from smart factory equipment and inspection
drones to health monitors and many more, has expanded
the reach of IT far beyond the realm of digital-only solutions
and technologies. For CIOs, it won’t be enough to ensure that
digital systems are properly implemented, integrated, and
functioning as expected on behalf of the enterprise. They’ll
also need to consider how digital technologies are integrated
with a physical tech stack that is spread across geographies –
and how they are operating in often-challenging environments
over time.
There are a host of analytics implications for this shift in the
technology landscape. Perhaps the most obvious change is in
the range and volume of data being generated by a teeming
portfolio of sensor-enabled physical assets – an issue that
many companies are already grappling with as they pursue IoT-based
strategies. In short, a lot more data can mean a lot more opportunity
for insight – if you have the right tools to make sense of it.
But there is an equally important application of analytics in this
context: Monitoring and managing all those physical assets. Because
without maintaining tight control over which assets are in place, what
they are responsible for doing, how they are performing, and when
and how they are degrading over time, the data they are generating
may not be particularly useful to end users.
Early progress in analytics
Analytics approaches are already delivering strong capabilities in this arena. Where only a few years ago it was considered a victory to be able to anticipate when classes or types of devices would likely need to be replaced or repaired, allowing organizations to preemptively pull devices out of commission for predictive maintenance, today it is possible to track the performance of individual assets in the physical tech stack. Think of it as specific predictability rather than generic predictability. That’s a new level of power and insight for any organization.

For example, in response to changing weather patterns and other impacts of climate change, some national, state, and municipal governments have launched strategies for using an army of inexpensive sensors to track rising water levels that may help more accurately predict flooding. With so many sensors generating data, this presents new analytics challenges and opportunities. Just as important, though, it creates a new class of physical assets that must be managed in order to ensure the overall success of the strategy, which hinges on accurate, reliable, consistent data from low-cost sensors.

In one US municipality pursuing this strategy, the IT team responsible for operating the sensors identified a number of concerns after the sensors were set in place, including:
• Too many false positives and false negatives.
• Identical sensors exhibiting different behavior.
• Field sensors often offline.
• Intermittent, inconsistent data.
• Unanticipated environmental challenges contributing to sensor calibration issues.
For the CIO responsible for managing this sensor-based flood prediction strategy, these issues suggested
infrastructure-level problems. Plus, while the sensors
were relatively inexpensive to purchase, dispatching a
team to examine the status of so many of them, many of
which were placed in difficult-to-reach, even dangerous
locations, would be cost-prohibitive over time. Was a
sensor simply blocked by a blade of grass or spider
web that would eventually move and cease to present
a problem, or did it exhibit technical malfunctioning
requiring repair or replacement?
Without better tools for monitoring the sensors, human
intervention would be the only way to find out – a time-consuming, resource-intensive proposition.
How analytics can help
Analytics-enabled “intelligent monitoring” capabilities can provide a much more focused, effective and efficient way to monitor the health of data-generating physical assets, across several critical aspects:
• Operations and process performance.
• Asset and equipment health.
• Sensor and connectivity issues.
With intelligent monitoring, technology teams can also develop asset hierarchies that distinguish between different types of remote physical assets and create a map of each – a water pump with four sensors for tracking everything from temperature to vibrations is managed differently than a single-purpose sensor designed to track water depth, for example. These profiles can be used by field engineers and others to assess how well or poorly the system is working, all the way down to the level of individual sensors.

INTELLIGENT MONITORING: ENABLING SCALE AND ENSURING PERFORMANCE
Intelligent monitoring is only one of many ways in which analytics capabilities will be used to better manage the large-scale emergence of the physical tech stack that is anticipated by Deloitte. But it is an important analytics-based strategy, because it can help establish a sturdy foundation for future efforts. It can also guard against unintended consequences, especially in cases where IT leaders may be eager to take advantage of inexpensive new capabilities with little regard for the cost and effort of the ongoing management of those capabilities.

When planning your strategy for managing the emerging physical tech stack, don’t just focus on how analytics can help make sense of the data these physical assets are generating. You must also consider ways in which analytics tools can help maintain the high performance levels of the assets themselves, delivering consistent, high-quality data from the front lines, wherever they are stationed.
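A simple way to picture the asset-hierarchy idea is a small data structure that rolls sensor status up to an asset-level health flag, as in the Python sketch below. The asset types, sensors and health rules are invented for the example.

from dataclasses import dataclass, field

@dataclass
class Sensor:
    sensor_id: str
    measures: str        # e.g. "temperature", "vibration", "water_depth"
    online: bool = True

@dataclass
class Asset:
    asset_id: str
    kind: str            # e.g. "water_pump", "depth_station" (invented types)
    location: str
    sensors: list = field(default_factory=list)

    def health(self):
        # Roll sensor status up to an asset-level flag so field visits are
        # dispatched only when remote checks can't resolve the issue.
        offline = [s for s in self.sensors if not s.online]
        if not offline:
            return "ok"
        if len(offline) == len(self.sensors):
            return "needs field visit"
        return "degraded - check remotely first"

pump = Asset("pump-17", "water_pump", "lift station 4", [
    Sensor("t1", "temperature"), Sensor("v1", "vibration"),
    Sensor("p1", "pressure"), Sensor("f1", "flow", online=False),
])
station = Asset("ds-02", "depth_station", "river mile 12",
                [Sensor("d1", "water_depth", online=False)])
for a in (pump, station):
    print(a.asset_id, a.kind, "->", a.health())

A multi-sensor pump with one failed channel can still be triaged remotely, while the single-sensor depth station immediately warrants a visit – the kind of distinction the asset hierarchy is meant to capture.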
SAS PERSPECTIVE
Field notes from the future
Quantum computing for optimization
In a world where data streams continue to multiply and
computational problems become more sophisticated, we
need every solution possible to handle the compounding
growth in data and complexity.
While quantum computing has been discussed for decades
as a possibility, we are only just starting to see some
exciting implementations in areas like cybersecurity, drug
development and climate change.
While quantum computing has important applications, its usefulness may be limited to those who can afford it and to a particular set of problems that fit its unique criteria.

In the analytics realm, optimization problems are the most likely to be paired with the unique capabilities of quantum computing. Classic optimization problems include flight patterns and delivery schedules. But many complex world problems like poverty, clean energy and clean water could someday benefit from the combination of optimization and quantum computing.
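To see why delivery schedules are a “classic” optimization problem, consider the brute-force Python sketch below: exhaustive search is trivial for four stops, but the number of candidate routes grows factorially, which is precisely the combinatorial wall that motivates pairing optimization with quantum computing. The coordinates are made up.

from itertools import permutations
import math

# Hypothetical depot-and-stops delivery route; coordinates are invented.
stops = {"depot": (0, 0), "A": (2, 4), "B": (5, 1), "C": (6, 5), "D": (1, 7)}

def length(route):
    path = ["depot", *route, "depot"]
    return sum(math.dist(stops[a], stops[b]) for a, b in zip(path, path[1:]))

# Exhaustive search handles 4 stops (4! = 24 routes), but route counts grow
# factorially with each added stop -- the wall quantum approaches target.
best = min(permutations([s for s in stops if s != "depot"]), key=length)
print(best, round(length(best), 2))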
Exponential intelligence requires responsible AI
The goal for analytics, machine learning and AI is to scale human observation and decision making. It benefits us all to realize that human experiences and opinions naturally influence human observation and decision making. Our feelings, opinions and emotions can influence the way we go about solving problems. Our own biases can even influence the data we collect and use to solve a particular problem.
Applying ethical considerations to analytics, machine learning and AI means we work to make sure technology does not harm people but instead helps people thrive. We refer to this as trustworthy AI or responsible innovation. This requires a greater level of oversight and awareness of potential issues throughout the AI development cycle, not just the operational aspects of AI. If the promise of exponential intelligence comes to fruition, with computers developing uniquely human insights like compassion and emotion, the importance of responsible innovation will become even more crucial.
Ambient analytics will power ambient experiences
As described in the report, ambient computing will offer a frictionless, proactive, intuitive “life beyond the glass.” In this world, our technologies anticipate our needs before we do. With ambient computing, even before you can say “Siri” or “Alexa,” your ambient computing devices will adjust settings, make plans and provide answers.

SAS has long had an interest in ambient analytics, where data is used in the background ubiquitously to make decisions without human intervention. Already, there are analytical decision points that happen around us without our knowledge or input, including spot cloud computing purchases, thermostat adjustments, traffic light changes and online advertising displays.

“The mission of SAS is to bring analytics everywhere, to make it ambient.”
Jim Goodnight
CEO, SAS
Ambient analytics becomes even more feasible when you bring analytics to the data by cleansing, transforming, filtering and analyzing data at its source. When data has intelligence as it’s issued, it can be directed from its source to use automatically. If data is clean, relevant and has specific merit as it’s generated, it becomes easier to put it to preprogrammed and intuitive uses. Data is everywhere. And with ambient analytics, analytics will be everywhere that data exists.

CONVERGING FUTURE TECHNOLOGIES
Will quantum computing solve our most complex problems? Will exponential intelligence help AI reach a level of acumen that’s even smarter and more ethical than human decisions? Will we find ourselves, through ambient experiences, surrounded by problem-solving computers that predict our needs before we speak to them?

These types of optimistic scenarios sound enticing – and each of these technologies intersects with analytics in powerful ways that could shape the future. Often, though, it’s not a single technology that changes our lives but a convergence of multiple technologies. Look at the way IoT, blockchain and online banking are converging today. Individually, the technologies in this report each sound intriguing, but it will likely be a combination of two or three of these technologies that ushers in the biggest changes.

Combining exponential intelligence with ambient analytics could expand our interactions with analytics and lead to a broader acceptance of data-fueled decisions in the world around us. As you look into the future, we encourage you to explore combinations of technologies from this report and elsewhere.
SAS Contributors
Ron Agresta, Senior Director – Product Management
Stu Bradley, Senior Vice President – Fraud & Security Intelligence
Gavin Day, Senior Vice President – Corporate Programs
Alyssa Farrell, Global Industry Marketer
Rusty Hamilton, Senior Account Executive
Bryan Harris, Executive Vice President & Chief Technology Officer
Patrick Hawthorn, IoT Sales Executive
Thorsten Hein, Principal Product Marketing Manager – Risk Research and
Quantitative Solutions
Steve Kearney, Global Medical Director
Alex Kwiatkowski, Director – Global Financial Services Industry Marketing