
A Guide to the Next-Gen Internet
Executive summary
A more spatially immersive, compelling, and frictionless 3D web, viewable by virtual reality (VR) and augmented reality (AR)
Understanding the metaverse
The metaverse is best understood as an evolution of the internet into a more spatially immersive, compelling, and frictionless 3D web, viewable by virtual reality (VR) and augmented reality (AR), along with traditional compute devices, with three key aspects: presence, interoperability and
standardisation. There are five key vectors for metaverse advancements, in our view: (1) hardware; (2) infrastructure; (3) content; (4) community; and (5) currency/settlement mechanism.
At this early stage in its emergence, there are of course varying views as to what constitutes the metaverse; in this report we outline how leading companies Meta, Microsoft, Google, Apple, NVIDIA, and Niantic conceptualise the metaverse, as well as perspectives from VCs and industry commentators.
Metaverse use cases are expanding
As is usually the case at the start of most such technology evolutions, some existing applications leverage emerging innovations to deliver a better and/or broader user experience. However, completely new use cases eventually develop as the technology evolves and matures, and new killer apps take advantage of better on-device processing and sensing, higher data rates, lower latencies and machine learning enabled by AI. At this stage, we see developments across gaming, entertainment, work collaboration, social media, virtual worlds, education and fitness. For businesses, we see advances in collaboration, design, and commerce, with additional sector-specific benefits to healthcare, real estate, and manufacturing.

Figure 1: Essential components of the Metaverse (evolution to a more immersive 3D internet). Source: Company data, Credit Suisse estimates
We believe that three factors can drive additional growth in
the next few years: (1) a disruptive AR/VR headset maker
emerges amid more applications for enterprises and
consumers; (2) a better integrated hardware-software
platform emerges to drive penetration; and (3) technological
improvements such as micro-LED and fast LCD enable a
better price-performance product.
Hardware: Near-term focus on AR/VR devices
The demanding performance characteristics of metaverse
applications have effects up and down TMT beyond
consumer electronics makers. Semiconductors are levered
to the ever-rising trend to create, store, transmit, and analyse data—we estimate the metaverse will drive a 100-300 bp tailwind to our base 6-8% forward revenue CAGR
for the sector. Cloud, infrastructure, and telecom companies
look set to benefit too, with a fully immersive metaverse
experience challenging servers with a 100x harder compute
task than an AAA game server hosting Fortnite today, and
telecom access networks facing 24x more traffic in a
decade (with more demanding latency requirements to boot, likely requiring edge compute and small cell deployments).
The metaverse should eventually bring profound changes to
the entire TMT sector, perhaps none as soon or as extensive
as in consumer electronics. In this report we examine the
special demands of the metaverse on AR and VR devices
where companies are expanding resources and innovation.
We also believe the metaverse will drive upgrades to
capabilities required on almost all hardware devices involved,
and may even lead to some dedicated “killer” hardware
products, such as the iPhone was for the 3G (mobile data)
era. Key pieces of the technology evolution include better
lens and display technologies, sharper on-device sensing
and faster processing, higher network data rates and lower
latencies allowing cloud compute resources, changes to
studio content production, and machine learning.
We expect global AR/VR headset unit sales of 42 mn units
(US$12.6 bn revenue) by 2025, representing a 48%
shipment CAGR (and 36% revenue CAGR) over 2020-25.
Sector views
We discuss the implications in three sections on the areas of
potential for each sector:
Figure 2: The metaverse ecosystem map, spanning content (regional internet, media and gaming companies across the US, Europe, China, Japan and Korea), platforms and virtual communities, software and payments, semiconductors (equipment/materials, foundry/IDMs, fabless/IC design), hardware components (display, lens/modules, PCB/materials, passives/battery, backend, XR device assembly), cloud compute, IT hardware, telecom services, devices and use-case examples such as apparel. Source: Credit Suisse

(1) Content: Includes regional internet sectors (China, Korea, Japan, Europe and the US) and the US media,
software, and apparel sectors. New applications are
being pioneered to evolve today’s experiences in
gaming, entertainment, commerce, the monetisation of
online property, advertising, and both consumer and
enterprise social connectivity. The best-placed
companies for the metaverse era have scaled datacentre footprints, deep services and content, existing
gaming/video platform businesses, integrated
hardware, and/or 3D game engines.
(2) Hardware and semiconductors: Includes
components, assembly, display, hardware and
semiconductors. For semiconductors, the metaverse fits
into our data paradigm thesis, with the metaverse
permeating the create/capture, store, transmit and
analyse view of the data economy and poised to benefit
as improvement in power/performance unleashes more
data consumption and use cases for silicon. We see
silicon content expansion across edge devices (AR/VR
and traditional compute), the upgrade of WiFi to 6E/7,
increased 5G/6G penetration and broadband fibre
transmission, and upgraded cloud infrastructure (both
more compute resources, and more diffuse edge
compute). For hardware, the metaverse would drive
AR/VR opportunities in assembly, display,
camera/optics, PCB/substrate, MLCC, connectors and
batteries.
(3) Telecom and infrastructure: Includes the US, Europe
and Asian telcos, US towers, data centres and
equipment, and Asia cloud IT infrastructure. The
metaverse has enormous potential to further expand or
divert screen time (Americans average 10+ hours/day
on media, including 3+ hours on TV) and drive more
bandwidth consumption. Internet traffic is already 80%
video and has been growing at a 30% CAGR. Our team
projects even modest metaverse usage could drive a
further 37% CAGR over the next decade to 20x current
data usage. This will support the value of the best-constructed networks in each region, while also
demanding upgrades to data centres and network
equipment.
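For reference, the compounding arithmetic behind these growth multiples can be sanity-checked with a few lines of Python; this is a simple sketch assuming constant annual growth, and the 20x/24x multiples quoted in this report are rounded figures.

# Sanity check of the traffic-growth compounding cited above.
# Assumption: simple constant-CAGR compounding over a 10-year horizon.

def multiple_from_cagr(cagr: float, years: int = 10) -> float:
    """Terminal multiple implied by a constant annual growth rate."""
    return (1 + cagr) ** years

def cagr_from_multiple(multiple: float, years: int = 10) -> float:
    """Constant annual growth rate implied by a terminal multiple."""
    return multiple ** (1 / years) - 1

print(f"37% CAGR over 10 years -> {multiple_from_cagr(0.37):.1f}x traffic")   # ~23x
print(f"20x in 10 years -> {cagr_from_multiple(20):.1%} CAGR")                # ~35%
print(f"24x in 10 years -> {cagr_from_multiple(24):.1%} CAGR")                # ~37%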
“
Metaverse: An evolution into a
more immersive 3D internet.
Understanding the metaverse
Immersive 3D internet with upgrades along five key vectors: (1) hardware; (2)
infrastructure; (3) content; (4) community; and (5) currency/settlement mechanism
A brief history
Science fiction. The first vision and naming of the
metaverse originated in the science fiction novel, Snow
Crash, by Neal Stephenson in 1992. In it, the metaverse
was a shared multiplayer online game made available over
the world’s fibre optics network and projected onto virtual
reality goggles. Users could control avatars that could
interact with other avatars and computer-controlled agents.
An avatar in that metaverse could gain status through its
technical acumen navigating the arena and gaining access
to exclusive spaces. Ready Player One, written by Ernest Cline in 2011 and adapted into a film in 2018, followed on with this concept in a future view of the world in 2045 where users escaped the real world by entering a metaverse called the Oasis, accessed with a VR headset and wired gloves.
Multiplayer games and ‘avatars’. The use of avatars has
extended even further back from these novels. In the early
1970s, Steve Colley and Howard Palmer invented a multiplayer game called MazeWar that could be played over
ARPANET, a precursor to the internet. The game’s first
avatar had a graphical eyeball that moved through the maze
pointing in the direction it was travelling to shoot other
players. In the 1980s, the Commodore 64 computer had a
virtual world Habitat with cartoon-like avatars that could walk
around and communicate with chat bubbles. As the internet
ramped by 1994, WorldsChat created a space-station-themed virtual space for avatars to have social interaction
and explore the various rooms. Out of that effort, a more
advanced programme, AlphaWorld, featured 700 themed
rooms or Active Worlds, with 12 different avatars, more
interaction with the game and reaching 250,000 cumulative
users. Other services, including Worlds Away, Virtual
Places, Comic Chat and The Palace, also offered these
virtual rooms.
Communities and services. In 2000, a Finnish company
created Habbo (formerly Habbo Hotel), an online community
that has accumulated 316 mn avatars since launch and now
has 800,000 active users. The main feature in the game is
a hotel where users can visit public areas (restaurants,
cinemas and clubs) and create guest rooms. The users in
this community can create a character, design and build
rooms, chat with other players and take care of virtual pets.
The early services formed the building blocks for Second
Life, a virtual online world that launched in 2003. By 2013,
it had 36 mn accounts created, with 1 mn monthly active
users who had spent 217,000 cumulative years online on
territory comprising over 700 square miles and spending
US$3.2 bn on in-world transactions.
The users in Second Life created avatars to interact with
places, objects and other avatars through chat, IM or voice.
The avatars could take any form or resemble their real-life
form and could travel by walking, vehicle, flying or
teleporting. The community allowed a variety of socialising,
games, group activities, and opportunities to build, shop,
create and trade property and services. The service also
used a virtual currency to buy, sell, rent or trade goods and
services. The goods could include buildings, vehicles,
clothes, art or jewellery, and services could include
entertainment, custom content, or business management.
Figure 3: Ready Player One participants in the world of VR (Source: Industry)
Figure 4: Second Life virtual reality world (Source: Industry)
Second Life went into decline as it was usurped by other
social media platforms and did not adapt well to a mobile
platform. Second Life’s chief architect, Philip Rosedale, in
an IEEE Spectrum interview in November 2021 noted
limitations with adults wanting to socialise with strangers
online, technical challenges getting more than 100 people
together in a copy of a concert space, a need for better
toolkits for large numbers of people to build the experiences
and content, and a better common currency that can unify
the diverse tokens that each platform uses. He also views VR as still having issues to solve around comfort, typing speed
and communicating comfortably with others.
A reporter from Reuters in a series published from 2007-08
noted issues including limited support to new joiners to
make the most of the platform, an overly complicated user
interface, IT issues (crashes and unstable IM), and a high
weighting towards adult content. They still did note an
incredible depth, passion and camaraderie in the community
and some interest in being able to buy a grid of space and
mould it into something.
The Second Life site still claims 750,000 monthly active users on the platform and US$650 mn in annual transactions, though this is marginal relative to the major social media platforms, and the service never ramped much above 1 mn people. The creators of Second Life followed up with a VR-based virtual world called Project Sansar, but it did not ramp up well. The company returned to focussing on Second Life and sold Project Sansar to Wookey Project Group, which is now focussing on virtual concerts including pre/after parties.

Metaverse: A still-evolving concept

The metaverse has seen a substantial increase in awareness in recent months with Facebook's renaming of the company as Meta in October 2021 and a focus on driving all of its efforts towards building out the metaverse, including the most recent renaming of its Oculus Quest VR glasses as Meta Quest. While the renaming of Facebook did bring increased attention to the concept, there isn't really an agreed definition as to what the metaverse is or how it will evolve. In this section, we outline how various key technology companies are viewing and defining this opportunity, along with frameworks from two industry experts/commentators.

CS' summary view of the metaverse

While industry participants hold various views as to what constitutes a metaverse, in our view it essentially boils down to an evolution into a more immersive 3D internet with upgrades along five key vectors (with different commentators' sub-divisions of these vectors):

Devices/hardware: The key interface between the user (humans) and the metaverse. These could be smartphones (which evolve and add functionality over the coming years), dedicated or linked AR/VR devices, or completely new dedicated hardware.

Infrastructure: The network and devices that connect the hardware device to the content—5G networks, WiFi, edge computing implementations and eventually 6G.

Content: All the various types of software and content, including gaming.

Community: All the various use cases with many (theoretically unlimited) individuals/users who interact and socialise within the platform and also across applications/platforms (use cases).

Currency/settlement: The method used to "settle" transactions for participation, content creation or direct commerce.
Figure 5: The metaverse ecosystem already growing diverse
Source: Building the Metaverse—Jon Radoff
In the following section, we further dig into the definition of
and the key concepts around building out the metaverse
from various leading companies in the technology world as
well as views/definitions of the concept from some
prominent industry commentators.
Meta’s view: Metaverse as an embodied internet
"Meta" comes from the Greek word for "beyond", and the metaverse is about creating a next generation of the internet beyond the constraints of screens and physics. The metaverse is
expected to be the next platform for the internet with the
medium even more immersive—an embodied internet where
people are in the experience, not just looking at it. Users will
be able to do almost anything they can imagine: get
together with friends and family, work, learn, play, shop and
create, with entirely new categories not available on
phones/computers today.
The metaverse will be the successor to the mobile internet,
enabling people to be able to feel present and express
themselves in new immersive ways. That presence should
allow people to feel they are together even if they are apart, whether chatting with family, playing games and feeling they are together in the world of that game, or conducting meetings as if face to face. The embodied internet
would mean, instead of looking at a screen, users would feel
they are in a more natural and vivid experience while
connecting socially, and during entertainment, games and
work, by providing a deep feeling of presence.
Meta believes several foundational concepts are
required for the metaverse:
Presence. The defining quality of the metaverse—this
should enable the ability to see people’s facial
expressions and body language, and feel in the moment
by being more immersed.
Avatars. Avatars will be how people represent
themselves rather than a static profile picture. The codec
avatar is a 360-degree photorealistic avatar that can
transform the profile image to a 3D representation with
expressions, and realistic gestures that can make
interactions richer. The avatar could have a realistic
mode but also a mode used for work, socialising, gaming
and clothing designed by creators that can be taken
across different applications.
Home space. The home space can recreate parts of
the physical home virtually, add new parts virtually and
add in customised views. The home space can store
pictures, videos and purchased digital goods, have people over for games and socialising, and include a home office for work.
Teleporting. A user can teleport anywhere around the
metaverse to any space just like clicking a link on the
internet.
Interoperability. Interoperability would allow someone
to buy or create something that is not locked into one
platform and can be owned by the individual rather than
the platform. Meta is building an API (Application
Programming Interface) to allow users to take their
avatar and digital items across different apps and
experiences. The interoperability would require
ecosystem building, norm setting and new governance.
Privacy and safety. The metaverse needs to build in
privacy, safety, interoperability and open standards from
the start, with features allowing a user choice in who
they are with, the ability to be private or to block another
user. The metaverse needs easy-to-use safety controls
and parental controls, and also to take out the element
of unexpected surprises.
Virtual goods. The metaverse would allow the ability to
bring items into the metaverse or project those into the
physical world. A user can bring any type of media
represented digitally (photos, videos, art, music, movies,
books and games) into the metaverse. These items can
also be projected into the physical world as holograms or
AR objects too. Street art could be sent over and paid for.
Clothing can also be created that is accurate, realistic and textured, and that can be purchased.
Natural interfaces. All kinds of devices will be
supported. The metaverse will have the ability to be used
on all types of devices ranging from using virtual reality
glasses for full immersion to AR to still be present in the
physical world, or through the use of a computer or
phone to quickly jump in from existing platforms.
Interaction and input can be through typing or tapping,
gestures, voice recognition or even thinking about an
action. In a future world, the user would not even need a
physical screen as they could view a hologram for the
images throughout the virtual world.
Figure 6: Meta's Horizon Home, World, and Workroom modes (Source: Meta)
Figure 7: Meta developing tools for AR overlays on the world (Source: Meta)
Microsoft’s view: Bringing people together and
fostering collaboration
Microsoft defined the metaverse as a persistent digital world inhabited by digital representations of people, places, and
things. The metaverse can be thought of as a new version of
the internet where people can interact as they do in the
physical world, and gather to communicate, collaborate and
share with personal virtual presence on any device.
The company views the metaverse as no longer just a vision, citing already-existing use cases such as attending a concert or show with other real people inside a video game, walking a factory floor from home, or joining a meeting remotely while still feeling present in the room to collaborate with other workers.
The company believes the metaverse has the ability to
stretch us beyond the barriers and limitations of the physical
world, which proved to be a larger barrier when COVID-19
prevented work from the office or travelling to visit clients,
friends, and family. Microsoft is working on tools to help individuals represent their physical selves better in the digital space and bring that humanity into the virtual world. Capabilities it is enabling include teammates joining meetings from anywhere and real-time translation that allows people from different cultures to collaborate in real time.
Microsoft is driving big investment into virtual connectivity in
gaming. It discussed its view that in gaming, the metaverse
would be a collection of communities with individual
identities anchored in strong content franchises accessible
on every device.
The company also views its Azure offerings for business as
well suited for the metaverse as noted from Satya Nadella at
the May 2021 Build Developer Conference: (1) with Azure
Digital Twins, users can model any asset or place; (2) with
Azure IoT, the digital twin can be kept live and up-to-date;
(3) Synapse tracks the history of digital twins and finds
insights to predict future states; (4) Azure allows its
customers to build autonomous systems that continually
learn and improve; (5) Power Platform enables domain
experts to expand on and interact with digital twin data using
low-code/no-code solutions; and (6) Mesh and Hololens
bring real-time collaboration.
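To make the digital-twin concept concrete, below is a minimal, vendor-agnostic sketch in Python of an asset twin kept in sync with telemetry and queried for insight; the class and field names are illustrative assumptions and do not represent the Azure Digital Twins or IoT SDKs.

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class MachineTwin:
    """Toy digital twin: mirrors a physical machine's latest state and history."""
    asset_id: str
    temperature_c: float = 0.0
    history: list = field(default_factory=list)

    def ingest_telemetry(self, temperature_c: float) -> None:
        # In a real deployment, updates like this would arrive via an IoT message stream.
        self.temperature_c = temperature_c
        self.history.append(temperature_c)

    def predicted_overheat(self, limit_c: float = 90.0, window: int = 5) -> bool:
        # Crude stand-in for the analytics layer: flag if the recent average nears the limit.
        recent = self.history[-window:]
        return bool(recent) and mean(recent) > limit_c * 0.95

twin = MachineTwin(asset_id="press-07")
for reading in (82.0, 85.5, 88.0, 89.5, 91.0):
    twin.ingest_telemetry(reading)
print(twin.asset_id, "overheat risk:", twin.predicted_overheat())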
Google: Development of AR/VR could reboot with its
ambient computing push
Google has been an early visionary for mixed reality
products, having introduced its Google Glass for developers
in 2013 and smartphone-driven VR systems in the form of
Google cardboard in 2014 and Daydream headset in 2016.
The company has a goal of driving ambient computing,
which means users can access its services from wherever
they are, and they become as reliable and essential as
running water. The company's hardware strategy around smartphones/tablets, Nest home devices and a potential re-emergence into AR/VR is tied to this ambient compute experience.
Figure 8: Microsoft collaboration in the metaverse (Source: Microsoft)
Figure 9: Real-time translation can improve communication (Source: Microsoft)
Figure 10: Google Glass Enterprise Edition Two (Source: Google)
Figure 11: Google Daydream smartphone-based VR (Source: Google)
The company recently hired former Oculus GM, Mark
Lucovsky, who is now leading Google’s operating system
(OS) team and experiences delivered on top of the OS for
AR. Google in its recruiting posts for AR hardware
developers, hardware engineers and software developers
indicated it is building the foundations for substantial
immersive computing, and building helpful and delightful
user experiences to make it accessible to billions of people through mobile devices. The company also indicated this includes building software components that control and
manage the hardware on its AR products.
Apple: Designing its ecosystem already around AR
Apple views AR as transforming how people work, learn, play, shop, and connect with the world, and as the perfect way to visualise things that would be impossible or impractical to see otherwise. Apple claims it has the world's largest AR platform, with hundreds of millions of AR-enabled devices and thousands of AR apps on its App Store. CS expects Apple to launch its first mixed reality device in late 2022, manufactured by Pegatron, though initial projections are for small unit volumes (1-2 mn). The product may be the first stage in unleashing more creativity among its developer community to move AR from phone/tablet viewing to 3D mixed reality viewing.
Nvidia: Omniverse to create and connect worlds within
the metaverse
Nvidia defined the metaverse in its August 2021 blog as a
shared virtual 3D world, or worlds, that are interactive,
immersive and collaborative and as rich as the real world. It
views it as going beyond the gaming platforms and video
conferencing tools aimed at collaboration. The metaverse
would become a platform that is not tied to any one app or
any single digital or real place. The virtual places would be
persistent, and the objects and identities moving through
them can move from one virtual world to another or into the
real world with AR.
Niantic’s view: Metaverse driven by AR
Niantic, developer of Pokemon Go, which was originally
spun out of Google, published a blog in August 2021
building its vision for the metaverse around AR rather than
VR.

Figure 12: Apple ARKit 5 effects overlay (Source: Apple)
Figure 13: Apple AR creation to convert models for AR use (Source: Apple)
Figure 14: Nvidia's Omniverse industry use cases (Source: Nvidia)

It views the world in science fiction novels such as Snow Crash and Ready Player One as a dystopian future of
technology gone wrong where users need to escape a
terrible real world with VR glasses to go into the virtual
world. The company views VR as a sedentary process
slipping into a virtual world and being cut off from everyone
around you with an avatar as a poor substitute for the real
human-to-human interaction. The company believes VR glasses remove the realistic interactions and sense of presence that come from being with people, which are difficult to replicate while staring into OLED display goggles.
Niantic is also developing a visual positioning system (VPS)
that can place virtual objects in a specific location so those
objects can persist to be discovered by other people using
the same application. With live production code, it has mapped thousands of locations. Niantic is attempting to build a much more in-depth digital map than Google Maps, one that can recognise location and orientation anywhere in the world by leveraging computer vision and deep-learning algorithms along with the millions of users playing its games such as Pokemon Go.
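To illustrate what a visual positioning system enables at the application layer, the sketch below persists virtual objects against real-world coordinates so that another user can retrieve anything anchored nearby. It is a hypothetical, heavily simplified model (plain latitude/longitude with a flat-earth distance approximation), not Niantic's actual VPS, which localises via computer vision far more precisely than GPS.

import math

class AnchorStore:
    """Toy store of virtual objects anchored to real-world locations."""
    def __init__(self):
        self._anchors = []  # list of (lat, lon, payload)

    def place(self, lat: float, lon: float, payload: str) -> None:
        # Persisting the anchor is what lets other users rediscover the object later.
        self._anchors.append((lat, lon, payload))

    def nearby(self, lat: float, lon: float, radius_m: float = 30.0):
        # Equirectangular approximation; adequate over tens of metres.
        def dist_m(a_lat, a_lon):
            x = math.radians(a_lon - lon) * math.cos(math.radians((a_lat + lat) / 2))
            y = math.radians(a_lat - lat)
            return math.hypot(x, y) * 6_371_000
        return [p for (a_lat, a_lon, p) in self._anchors if dist_m(a_lat, a_lon) <= radius_m]

store = AnchorStore()
store.place(51.5007, -0.1246, "virtual mural")   # one user leaves an object behind
print(store.nearby(51.5008, -0.1247))             # another user nearby discovers it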
Niantic is leaning into AR so that users can be outside and connected with the physical world, with AR as an overlay to enhance those experiences and interactions, getting people back outside and active by learning about their city and community.
The company’s vision follows Alan Kay’s 1972 Dynabook
paper that discussed the trend of continuing to shrink
compute (from mainframes now down to smartphones/
wearables) and eventually to compute devices disappearing
into the world. Niantic views shifting the primary compute surface from the smartphone to AR glasses as removing demands on the hands, making it easier to access data and services, and to view overlays on the real world. Niantic has
partnered with Qualcomm to invest in a reference design for
outdoor-capable AR glasses that can orient themselves
using Niantic’s map, and render information and virtual
worlds on top of the physical world with open platforms
allowing many partners to distribute compatible glasses.
The company’s view of the metaverse is a world infused with
“reality channels”, where data, information, services and
interactive creations can be overlaid on the real world. The
company incorporated these into its products Field Trip,
Ingress and Pokemon Go as games that can make the
world more interesting. The capability, though, can stretch
beyond games and entertainment, as the AR can allow
education, guidance, and assistance anywhere from work
sites to knowledge work.
Figure 15: Omniverse allows collaboration on a 3D project (Source: Nvidia)
Figure 16: Nvidia's Omniverse platform (Source: Nvidia)
Figure 17: Niantic's metaverse uses AR to overlay digital objects in the physical world (Source: Company data)
Industry commentators’ views
Different companies and participants interpret what
metaverse means differently. We provide views from some
prominent industry commentators, one a venture capitalist
and another an entrepreneur, regarding what a metaverse
entails.
A. A venture capitalist’s definition of the metaverse and
its development vectors
Venture capitalist (VC) Matthew Ball defined (here and
here) the metaverse as a “massively scaled and
interoperable network of real-time rendered 3D virtual
worlds which can be experienced synchronously and
persistently by an effectively unlimited number of users
with an individual sense of presence and with continuity
of data, such as identity, history, entitlements, objects,
communications and payments”.
The metaverse, in his view, should be viewed as a
quasi-successor state to the mobile internet as it would
not replace the internet but will build on it and transform
it just as mobile devices changed the access,
companies, products/services and usage of the
internet. As with mobile internet, the metaverse is a
network of interconnected experiences and applications,
devices and products, and tools and infrastructure. The
metaverse places everyone in an embodied, virtual or
3D version of the internet on a nearly unending basis.
Some characteristics of the metaverse are that it would
be: (1) persistent; (2) synchronous and live; (3) without
caps on users and providing each user with an individual
sense of presence; (4) a fully functioning economy; (5)
an experience that spans digital and physical worlds,
private and public networks, and open and closed
platforms; (6) offer unprecedented interoperability of
data, digital items/assets and content; and (7)
populated by content and experiences created and
operated by an incredibly wide range of contributors.
The VC is tracking the metaverse around eight core
categories:
(1) Hardware: Technologies and devices to access,
interact and develop the metaverse (VR, phones,
haptic gloves).
(2) Networking: Development of persistent real-time
connections, high bandwidth and decentralised data
transmission.
(3) Compute: Enablement of compute to handle the
demanding functions (physics, rendering, data
reconciliation and synchronisation, AI, projection,
motion capture and translation).
(4) Virtual platforms: Creation of immersive and 3D
environments/worlds to stimulate a wide variety of
experiences and activity supported by a large
developer and content creator ecosystem.
(5) Interchange tools and standards: Tools,
protocols, services and engines to enable the
creation, operation and improvements to the
metaverse spanning rendering, AI, asset formats,
compatibility, updating, tooling and information
management.
(6) Payments: Support of digital payments including
fiat on-ramps to pure-play digital currencies/crypto.
(7) Metaverse content, services, and assets: The
design, creation, storage and protection of digital
assets such as virtual goods and currencies
connected to user data and identity.
(8) User behaviour: Changes in consumer and
business behaviour (spending and investment, time
and attention, decision making and capability)
associated with the metaverse.
B. An entrepreneur’s view: The seven layers of the
metaverse
Jon Radoff, CEO of Beamable, a Live Game Services
platform, is another prominent industry commentator on
the topic of the metaverse and is widely quoted by
various articles associated with the concept. His prior
work has focussed on online communities, internet
media and computer games. Jon sees the metaverse
(link) as composed of seven layers:
(1) Experience: The experience layer is where users
do things in the metaverse including gaming,
socialising, shopping, watching a concert or
collaborating with co-workers. The metaverse
experiences do not need to be 3D or 2D, or even
necessarily graphical; it is about the
inexorable dematerialisation of physical space,
distance and objects. When physical space is
dematerialised, formerly scarce experiences may
become abundant.
(2) Discovery: The discovery layer is about the push
and pull that introduces people to new experiences.
Broadly speaking, most discovery systems can be
classified as either inbound (the person is actively
seeking information about an experience)
or outbound (marketing that was not specifically
requested by the person, even if they opted in).
The discovery layer could include the curated
portals, online agents, rating systems and
advertising networks drawing users to discover
different areas.
(3) Creator economy: Not only are the experiences
of the metaverse becoming increasingly immersive,
social, and real-time, but the number of creators
who craft them is increasing exponentially. This
layer contains all of the technology that creators
use daily to craft the experiences that people enjoy.
(4) Spatial computing: Spatial computing has
exploded into a large category of technology that
enables us to enter into and manipulate 3D spaces,
and to augment the real world with more
information and experience. The key aspects of
such software includes: 3D engines to display
geometry and animation; geospatial mapping; voice
and gesture recognition; data integration from
devices and biometrics from people; and next-generation user interfaces.
(5) Decentralisation: The ideal structure of the
metaverse is full decentralisation. Experimentation
and growth increase dramatically when options are
maximised, and systems are interoperable and built
within competitive markets. Distributed computing
powered by cloud servers and microservices
provide a scalable ecosystem for developers to tap
into online capabilities without needing to focus on
building or integrating back-end capabilities.
Blockchain technology, which enables value-exchange between software, self-sovereign identity
and new ways of unbundling and bundling content
and currencies, is a large part of decentralisation
(this area of innovation can be called Web 3.0).
Figure 18: The Seven Layers of the Metaverse
Source: medium.com (link)
(6) Human interface: Computer devices are moving
closer to our bodies, transforming us into cyborgs.
Smartphones have evolved significantly from their
early days and are now highly portable, always-connected powerful "computers". With further
miniaturisation, the right sensors, embedded AI
technology and low-latency access to powerful
edge computing systems, they will absorb more
and more applications and experiences from the
metaverse. Dedicated AR/VR hardware is also
coming into the market, and in the coming years
will likely evolve significantly. Beyond smartglasses,
there is a growing industry experimenting with new
ways to bring us closer to our machines such as
3D-printed wearables integrated into fashion and
clothing.
(7) Infrastructure: The infrastructure layer includes
the technology that enables our devices, connects
them to the network and delivers content. This
includes the semiconductors, battery technology,
cloud servers and storage, and 5G and Wi-Fi
transmission required. The infrastructure upgrades
on compute, connectivity and storage
supplemented by AI should dramatically improve
bandwidth while reducing network contention and
latency, with a path to 6G in order to increase
speeds by yet another order of magnitude.
Web 3.0 envisions a more decentralised metaverse
Web 3.0 envisions an internet based on decentralised blockchains, using token-based economics for
transactions. The new vision contrasts with Web 2.0 where
the large internet platform companies have centralised a lot
of the data and content. Web 3.0 was coined in 2014
by Ethereum co-founder Gavin Wood and in the past
decade has seen more interest as a concept across tech
companies, VCs, start-ups and blockchain advocates. A
number of virtual communities in the metaverse are forming
with a decentralised concept that may open up the rule
making of the community to a collective majority of
individuals on the platform and are also adopting the token
concept as virtual currency.
While the original "internet" (Web1) was built on largely open-source standards, Web 2.0 leveraged those same open and standards-based technologies but ended up creating large and closed communities, often referred to as "walled-garden" ecosystems. As Jon Radoff has argued in one of his
posts, walled gardens are successful because they can
make things easy to do—and offer access to very large
audiences. But walled gardens are permissioned
environments that regulate what you can do, and extract
high rents in exchange. He argues that there are three key
features of Web 3.0 that should change this paradigm of
Web 2.0:
Value-exchange (rather than simply information exchange). The enabling technology for value-exchange is smart contracts on blockchains. The blockchain is a shared ledger that allows companies, applications, governments and communities to
programmatically and transparently exchange value
(assets, currencies and property, etc.) with each other,
without requiring custodians, brokers or intermediaries.
The ability to programmatically exchange value between
parties is a hugely transformative development.
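A minimal sketch of this idea of programmatic value-exchange: a shared ledger exposes a transfer rule that any party can invoke without an intermediary, and every settled transfer is recorded. This is a plain-Python analogy for intuition only, not a real blockchain or smart-contract implementation (there is no consensus, cryptography or decentralisation here).

class SharedLedger:
    """Toy ledger illustrating rule-based value transfer without an intermediary."""
    def __init__(self, balances: dict):
        self.balances = dict(balances)
        self.log = []  # append-only record of settled transfers

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        # The 'contract' logic: transfers settle only if the rule is satisfied.
        if amount <= 0 or self.balances.get(sender, 0) < amount:
            raise ValueError("transfer rejected by ledger rules")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        self.log.append((sender, receiver, amount))

ledger = SharedLedger({"creator": 0, "collector": 120})
ledger.transfer("collector", "creator", 45)   # pays the creator directly for a digital item
print(ledger.balances, ledger.log)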
Self-sovereignty. An important part of Web 3.0 is
inverting the current model where one uses one’s login
details for “walled gardens” to interact with several other
online applications. Instead of having a company own
one’s identity and then granting us access to other
applications, one would own one’s own identity and
choose which applications to interact with. This can be
accomplished by using certain digital wallets. One’s
wallet becomes one’s identity, which can then allow you
to use various decentralised applications on the internet
that need to interact with one’s currencies and property.
The re-decentralisation of the internet. Currently,
there are substantial dependencies across the internet
on a small number of highly centralised applications. But
with Web3, the power shifts back to individual users,
creators and application developers with far fewer
centralised authorities to extract rents or ask permission
from. This transfer of power and ability for users to
monetise their work by certifying efforts on the
blockchain and monetising that by exchanging the work
for tokens is expected to lead to an explosion of new
creativity in the form of applications, algorithms, artwork,
music, AI/robots, virtual worlds and metaverse
experiences, with more of the rewards staying in the
hands of the owners and creators.
Metaverse still has its limitations and risks
We do see significant potential from the latest wave of
investments to upgrade to a more immersive internet but
also see some limitations and risks in the latest cycle which
pose obstacles and may limit its success.
AR/VR limitations. A truly immersive internet would benefit from a 360-degree VR field of view or AR glasses, versus access through traditional smartphones, tablets
and PCs. VR saw a first wave of hype in 2016 with the
launch of VR headsets by many companies and many
smartphone-based VR platforms, with every major trade
show seeing long lines to experience the concept.
The first wave failed to live up to the hype with only 2
mn units shipped. Early hardware limitations included tethering to a PC, vertigo and discomfort with extended use, a lack of AAA gaming titles and content, and isolation from others while wearing the headset. VR technology is improving, with better processing and sensors, faster refresh rates, higher resolution and high-speed WiFi eliminating the tethering, and content should improve with the new wave of metaverse funding. Nevertheless, VR would still
lead to some discomfort from wearing for an extended
time and isolating the user from their surroundings.
Mainstream interest in virtual worlds. The hurdle for virtual worlds is higher, as earlier communities struggled to keep up activity 24-7, and adoption also requires changing the behaviour of users who still seek out real-world experiences. The virtual worlds are improving in terms of
audio/visual but still fall short on three of the five senses
(smell, taste and touch) to be fully immersed in the
experience. Some advocates are more aggressively
investing in AR technologies which bring elements of the
digital world into the real world.
Policing the communities. Social media platforms
have continuously faced issues over which content and authors to allow or censor, and also their ability to use
AI and human monitors to take down abusive content. A
decentralised metaverse, without the scale of the
resources major internet providers have, may also
struggle to keep up with monitoring abuse on the
platform.
NFT speculation, fakes, and metaverse asset
inflation. NFTs (non-fungible tokens) represent a
unique piece of data on the blockchain that claims to
offer a certificate of authenticity or proof of ownership
though they do not restrict sharing or copying the digital
file or prevent the creation of NFTs with identical
associated files. NFTs have been associated with
transfers of artwork, in-game assets, music and sports
cards, and can be a way to pay a creator for their work.
The NFT market, though, is introducing stolen goods,
bubbles and the risk of over-saturation as more are
created. The metaverse is also drawing headlines for
rising real estate prices in some of the digital communities and the high-priced resale of virtual luxury bags that carry no right to the corresponding item in the physical world.
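To illustrate why an NFT records ownership without restricting copies, the sketch below models a token as little more than an entry mapping a token ID to an owner and a hash of the associated file; nothing in the record stops the file itself from being duplicated, or another token from being minted over an identical file. This is a conceptual toy in Python, not an implementation of an actual token standard such as ERC-721.

import hashlib

class ToyNFTRegistry:
    """Toy registry: token ID -> (owner, content hash). Ownership record only, no copy protection."""
    def __init__(self):
        self.tokens = {}
        self._next_id = 1

    def mint(self, owner: str, content: bytes) -> int:
        token_id = self._next_id
        self._next_id += 1
        # Only a fingerprint of the file is recorded; the file itself lives (and copies) elsewhere.
        self.tokens[token_id] = {"owner": owner, "content_hash": hashlib.sha256(content).hexdigest()}
        return token_id

registry = ToyNFTRegistry()
artwork = b"pixel data of a digital artwork"
t1 = registry.mint("alice", artwork)
t2 = registry.mint("bob", artwork)   # nothing prevents a second token over an identical file
print(registry.tokens[t1]["content_hash"] == registry.tokens[t2]["content_hash"])  # True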
“
Potential to transform social
interactions, leisure activities,
education and work.
Metaverse use cases are expanding
The metaverse would impact gaming, entertainment, work collaboration, social media,
virtual worlds, education and fitness in the near term, and will see additional use
cases over time.
By moving to a more 3D and immersive form of the internet,
the metaverse has the potential to expand opportunities for
the current internet. We highlight the use cases and current
developments around some of the major areas the
metaverse would impact including gaming, entertainment,
work collaboration, social media, virtual worlds, education
and fitness. While these are some of the current prominent
use cases, given that the metaverse is an adaptation and
evolution of the internet, use cases are, frankly, endless
although we list some additional use cases, notably in the
commercial space at the end of this section.
Gaming: Platforms to further leverage 3D immersion

Video games are expected to play a central role in the metaverse as they already build on immersive experiences with 3D graphics, VR-enabled titles, and platforms for user creativity and built-out digital goods for use in gaming. The overall market is growing exponentially and is now at 3 bn users, projected by industry players to reach 4.5 bn by 2030. Gaming platforms can be further unleashed with the metaverse by allowing a gamer to be embodied and move around in the game, or to build out further upgrades available to other users. As compute power rises, deeper multi-player gaming can be fully enabled and, eventually, a user could turn into a hologram to show up visually with someone in another location.

Entertainment: 3D virtual option opens up for film, television, and music content

The metaverse is stimulating new forms of entertainment, with a likelihood that technology eventually would allow a user to be viewed as a hologram in another place and teleport into a remote concert or party. During the pandemic, a number of platforms developed the major use case of virtual concerts that took the place of live performances. The advantage of these concerts is they could reach far more users globally than a live venue and could also transform the singer into an avatar or the stage into a virtual environment, with special effects and opportunities for fan interaction. The concerts also formed a good platform to promote merchandising for the event.

Social: Leveraging AR/VR for connectivity and presence in the metaverse

Social media is embracing the metaverse as a way to expand connectivity with the use of AR/VR. We highlight the efforts of social media platforms as well as some of the early work by dating sites to introduce virtual engagements before meeting in the real world.
Industrial: A solution for collaborative work amid
hybrid office evolution
Work in the metaverse would allow a better sense of presence, shared physical spaces, and a productive work set-up when working from home. The work setting can allow
collaboration to view a 3D design with others. Companies are
investing resources towards various products, including
customisable workrooms for meetings, an office space to
customise home workspace, and 2D progressive web apps to
view applications and social media through VR.
Figure 19: Gamers now approaching 3 bn global players. Total global gamers (bn): 1.99 (2015), 2.11 (2016), 2.26 (2017), 2.42 (2018), 2.55 (2019), 2.69 (2020), 2.81 (2021), 2.95 (2022), 3.07 (2023). Source: Newzoo
Commerce: Virtual communities opening up new
forms of commerce
The metaverse is also opening up a platform for commerce,
with creators making digital objects, offering services and
experiences, building worlds, and a place to sell both
physical and digital products. The metaverse platform goal is
for content that is purchased on one platform to be available
on other platforms. For purchases, NFTs can be securely
purchased and sold with the data stored on the blockchain.
A number of communities are setting up public spaces that
can be accessed through digital means. Most of these
networks use avatars to represent the users, though they
vary in digital ownership, centralisation of the platform, rules of engagement with others, the ability to build and take ownership in the ecosystem versus being on a platform from one of the major social media companies, and the ability to create commerce activities. Some gaming platforms have
centralised environments, but several new communities are
developing around decentralisation—which places ownership
of goods on the blockchain and is opening up ownership and
control of the rules toward decentralised policymaking.
Other use cases
The above paragraphs detail use cases across various end-markets, with examples of companies/applications operating within each area. Given that 'metaverse', in some
ways, is just an adaptation/evolution of the ‘internet’, use
cases are, frankly, endless.
Manufacturing. Metaverse can help the manufacturing
process significantly, particularly in the areas of design
and product development, by facilitating an improved
relationship and interaction between business owners,
suppliers (including design companies) and customers.
Such a platform can result in a rapid production process
design, increase the number of product designs, lead to
more collaborative product development, and potentially
reduce the risk to quality control. In addition, customers
in the metaverse could have improved visibility into the
supply chain process with 3D representations for how
products are built, distributed, and sold. There could also
be additional opportunities for manufacturers to have
add-on digital-products (clothes/fashion, homes, cars) in
the metaverse that resemble and mimic the real-world
products.
Fitness. VR also gives the chance to transform the living
room, similar to how virtual sports gaming products got
many off the couch to play their motion-controlled sports
games. Fitness games through VR will allow more
immersion and interactive training by allowing a user to
work out in new worlds, playing against other users or
against the machine AI.
Education. Education took a 2D step through online
learning during the pandemic but still struggled with
student engagement. An upgrade to use 3D features and
VR has potential to enhance the experience and
engagement by allowing teachers to teleport their students
to a different place or time, or into a virtual classroom,
library or gymnasium with their fellow students. A headset
or glasses could enable much more active exploration of
history, biology, a visit to a museum in another location, or
an interactive class with fellow students.
Other online VR educational applications and
programmes are now cropping up. One of the
companies is offering these experiences by allowing
customisation of a virtual classroom and simulation of
presence with virtual avatars for more interactive remote
learning. The school features presentations and
documents, customisable whiteboards, notifications and
moveable Post-its, and live video and text. Spatial I/O’s
collaborative VR platform features iTeacher, which is
architecting virtual worlds for high school education to
connect on academic topics with its metaverse now
having 14 different spaces to simulate different lessons.
“
AR/VR devices to be early
beneficiaries.
Hardware: Near-term focus on AR/VR devices
We expect AR/VR devices to see strong growth over the next few years, along with
advancement in hardware, software and connectivity
While the evolution of metaverse will eventually affect almost
all aspects of hardware devices involved, and may even lead
to some dedicated ‘killer’ hardware product—like the iPhone
was for the 3G (mobile data) era—in the near future,
AR/VR devices are where companies' and investors' attention is focused with regard to metaverse devices. With the
advancement in optics, chipset, 5G and software, AR/VR is
not only used for games but has also seen applications in
educational, industrial and medical facilities.
We expect global AR/VR headsets to deliver 42 mn units and
revenue to reach US$12.6 bn in 2025, representing 48%
shipment CAGR and 36% revenue CAGR over 2020-25.
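As a quick check on the forecast arithmetic, the snippet below backs out the 2020 base implied by the 2025 targets and the quoted 2020-25 CAGRs; small differences versus the 2020 figures shown in Figure 20 reflect rounding.

# Back out the implied 2020 base from the 2025 forecast and the quoted 2020-25 CAGRs.
units_2025_mn, revenue_2025_usd_mn = 42, 12_600
unit_cagr, revenue_cagr, years = 0.48, 0.36, 5

units_2020 = units_2025_mn / (1 + unit_cagr) ** years
revenue_2020 = revenue_2025_usd_mn / (1 + revenue_cagr) ** years
print(f"Implied 2020 base: ~{units_2020:.0f} mn units, ~US${revenue_2020:,.0f} mn revenue")
# -> roughly 6 mn units and ~US$2,700 mn, consistent with the 2020 figures in Figure 20.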
We believe three factors could drive additional growth of the
AR/VR market in the next few years, over and above our
forecast growth rate: (1) a disruptive AR/VR headset maker
emerges amidst more applications in enterprises and the
public; (2) a better integrated hardware-software platform
emerges to drive penetration; (3) technological
improvements such as micro-LED and fast LCD enable a
better price-performance product.
Products from emerging players
In light of AR/VR's ability to enhance the gaming experience,
other industries like education, healthcare and construction
will also benefit from the rise of the headsets. We see an
increasing number of start-ups designing headsets for the
enterprise and public sector. We expect enterprise applications to be another driver for AR/VR devices.
We believe start-ups and emerging headset companies will
expand the use of AR/VR devices, and further support the
adoption of headsets for commercial use. We expect
applications outside gaming/entertainment to be the
surprise factor for headsets in the next few years.
Platform will be key in the metaverse
We will pay attention to metaverse content and application
development. Facebook renamed itself Meta and declared a
commitment to developing a metaverse ecosystem, and
launched its virtual social platform Horizon Worlds in Dec-2021. Roblox is already considered a virtual metaverse community with a successful online game creation platform with 31 mn DAUs (daily active users) in 2021. It allows
users to build their own areas and play games with other
users.
We believe no single company is capable of building a
metaverse, and the concept would be closer to a real world
society in which everyone plays a role contributing via
different tools. Current products of leading companies are
platforms providing tools that allow users to build their own
spaces. There are many others, which have the potential to
become larger with time. Although the platforms are early
stage, they will help drive AR/VR headset demand as user
generated content (UGC) and applications are created.
Figure 20: AR/VR headset market grows behind "metaverse". Value (US$ mn): 1,826 (2018), 2,028 (2019), 2,665 (2020), 4,000 (2021E), 5,536 (2022E), 7,055 (2023E), 9,030 (2024E), 12,623 (2025E); shipments rising from roughly 6 mn units per year in 2018-20 to 10, 15, 21, 30 and 42 mn over 2021E-2025E. Source: IDC, Credit Suisse estimates
VR technology roadmap
Common complaints against VR centre on the motion sickness caused by: (1) the screen door effect (SDE)
(visible gaps between pixels); (2) mura (colour inconsistency
of each pixel); (3) aliasing (series of square blocks instead of
curved lines); (4) latency (low chipset processing and
transmission speed); and (5) the weight of headsets. To
mitigate these issues, headset makers are advancing the
chipset, optics, display, and tracking solution to improve the
user experience.
Optics: Fresnel lenses to replace aspherical lenses
Fresnel lenses are advocated by more VR makers due to
their lighter weight and thinner centre. HTC was the first
to use Fresnel lenses in its high-end product, Vive,
followed by Oculus. To reduce the weight of VR devices
and enhance the user experience, Fresnel lenses are
more likely to be adopted by the majority.
Chipset: Qualcomm XR series
Qualcomm launched its Snapdragon Extended Reality (XR) platform series in 2018 as its first SoC for AR/VR devices. The XR2 5G platform, a derivative of the Snapdragon 865 (7nm), was announced in Dec-2019. Oculus Quest 2 was the first to deploy the
chipset, with a process speed doubling that of
Snapdragon 835 in Quest 1. The XR2 enables 2x more
video bandwidth, 6x higher resolution, and 11x AI
improvement. The strong performance has attracted
most of the major headset makers including HTC,
PICO, DPVR, Lenovo, and Microsoft Hololens.
Display: Fast-LCD is becoming mainstream
The display in the HMD (head-mounted device) for VR is
usually one or two pieces of LCD (liquid crystal display)
or OLED (organic light-emitting diode) panel, depending
on the design of the device, although a dual display
system is preferred as human beings have two eyes.
Although LCD and OLED are both applicable for VR
displays, the majority of recently announced VR models
have adopted LCD due to its cost advantage. OLED
display was first used on VR devices by Sony’s PSVR,
Oculus Rift, and HTC Vive due to its fast response time,
which reduces motion image blur significantly. However,
OLED is limited by its inadequate lifetime and higher
cost/lower yield for higher resolution. Normal LCD has
the characteristic of high resolution, high brightness,
long lifetime and low cost, but its response time is
~100x slower than that of OLED. As a result, LCD usually displays much more severe image blurs than OLED.
Tracking solution: 6DoF and inside-out tracking solution will be the basic feature

6 degrees of freedom (6DoF) is an upgrade of 3DoF: on top of the three rotational axes (rolling, yawing and pitching), it adds positional tracking along the three translational axes. 3DoF is enough for basic applications such as
VR movies, but for a complete immersive experience
like gaming, healthcare and training, 6DoF is required.
Most of the newly released VR headsets in 2021
employed 6DoF, and we expect the 6DoF function to
be a basic feature for VR in the future.
In inside-out positional tracking, the camera or sensors
are located on the headset being tracked (e.g. Oculus);
while sensors in the outside-in scenario are placed in a
stationary location (e.g. PSVR). We believe inside-out
will be used for most VR devices for its mobility and
flexibility, whilst outside-in will likely be used in specific
scenarios such as healthcare VR and console-powered
VR for its accuracy.
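For intuition on the 3DoF/6DoF distinction, the sketch below represents the two pose types as simple data structures: a 3DoF pose carries orientation only, while a 6DoF pose adds position, which is what inside-out or outside-in tracking must recover. This is an illustrative Python sketch, not any headset vendor's actual SDK.

from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Orientation only: enough for look-around experiences such as 360-degree video."""
    roll_deg: float
    pitch_deg: float
    yaw_deg: float

@dataclass
class Pose6DoF(Pose3DoF):
    """Adds translation, so the user can physically lean, crouch or walk within the scene."""
    x_m: float = 0.0
    y_m: float = 0.0
    z_m: float = 0.0

# A tracked headset that has turned 30 degrees and stepped half a metre forward:
print(Pose6DoF(roll_deg=0.0, pitch_deg=0.0, yaw_deg=30.0, x_m=0.0, y_m=0.0, z_m=-0.5))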
Figure 21: Major VR specs. Compares best sellers (Quest 2, Neo 2) and new 2021 releases (Pico Neo 3, DPVR P1 Pro 4K, P1 Pro Light, HP Reverb G2, Valve Index, HTC, Nolo Sonic, Vivo Flow, Huawei 6DoF) from Oculus, Pico, DPVR, HP, Valve, HTC and others across processor, type (standalone, PC-powered or phone-powered), display type, resolution, refresh rate, optics, field of view, controllers (3DoF/6DoF), tracking solution, weight and price. Source: Credit Suisse
AR’s Technology Roadmap
Mark Zuckerberg called AR glasses “one of the hardest technical challenges of the decade.” Due to the complexity of the architecture, the price of AR glasses is still high, which hinders penetration. For example, one of the products from a leading industry player sells for US$3,500, and the majority of its users are enterprises. Unlike VR, where most components are converging on mainstream specs, AR makers are still exploring different architectures:
(1) OLEDoS is fabricated on silicon wafers instead of glass or polyimide substrates.
(2) DLP is a popular solution for projectors, using an array of micro-mirrors (a digital micromirror device, DMD) on a semiconductor chip to reflect light and direct red, green and blue light to the imager.
(3) LCOS is a variation of LCD technology, separating light into red, green and blue components and reflecting it off liquid crystal cells on the chip surface; a CMOS backplane beneath the cells controls the voltage on square reflective aluminium electrodes.
Display: DLP, LCOS and Si-OLED are mainstream, and micro-LED would likely be the ultimate solution
AR glasses require compact and power-efficient displays with very high contrast and brightness. There has been a leap in AR display technology towards these objectives, yet there is still room to improve on yield rate and costs. Organic light emitting diodes on silicon (OLEDoS), digital light processing (DLP), and liquid crystal on silicon (LCOS) are the three main pathways today.
Figure 22: Popular AR specs
[Specification table covering popular AR glasses (MAD Gaze Glow Plus, Microsoft HoloLens 2, Vuzix M4000, Xiaomi Smart Glasses, Dream Glass 4K, INMO Air, Nreal Light, Lenovo ThinkReality A3, Magic Leap 1 and Rokid Glass 2), comparing processor, type (standalone, phone-powered or PC-powered), display type, resolution, refresh rate, optics, field of view, tracking, weight and price. Unlike VR, the architectures vary widely: processors span NVIDIA, Qualcomm Snapdragon and various ARM-based SoCs; displays include OLED, LCOS, DLP, micro-LED and light engines; optics include waveguides, birdbath optics and curved mirrors. Listed resolutions range from 640×480 to 1920×1080, refresh rates from 50 Hz to 122 Hz, fields of view from 28° to 53° diagonal, tracking from 3DoF to 6DoF inside-out, weights from roughly 40 g to 560 g, and prices from US$349 to US$3,500.]
Source: Credit Suisse
Initially, LCOS was the major technology for AR owing to its high brightness, but it was not energy- or cost-efficient. OLED has limitations with brightness; however, breakthroughs in OLED materials and the move to silicon substrates have bridged the gap. OLEDoS (OLED on silicon) is therefore becoming the most popular technology thanks to its higher contrast, better power efficiency, thinner profile, wider temperature range, and faster response time.
Although there are still many hurdles to achieving mass
production, we believe micro-LED would be the ultimate
solution for AR glasses due to its super-high brightness
and contrast, excellent temperature endurance, fast
response time, and low energy consumption.
Optics: Waveguide becomes the major architecture
The industry is developing two different approaches in
waveguide technology:
(1) Diffractive waveguides are considered the most mature technology and are used in HoloLens 2, Magic Leap 1 and Vuzix M4000. A diffractive optical element (DOE) or holographic optical element (HOE) injects the light over a small area into the waveguide and extracts it towards the user’s eyes. Because the diffractive method disperses wavelengths, separate waveguides are used for different colours; mainstream products are leaning towards two layers of waveguides for thinner glasses.
(2) Reflective waveguides, designed by the Israeli AR company Lumus, do not require nano-photonics. This method employs 1D or 2D semi-transparent mirrors along the optical path to guide the light to the user’s eyes.
Birdbath is another solution for AR optics. It contains a spherical mirror/combiner (part-mirror) and a beam splitter. The method works like a birdbath: light from the OLED is projected into the beam splitter at a 45-degree angle to the OLED light-source plane. Lenovo Mirage AR and ODG R9 were two examples adopting this method, but it has two major downsides: light loss and double images.
Curved mirror (adopted by DreamWorld and Leap Motion) is the cheapest see-through display technology. It is based on semi-reflective curved mirrors placed in front of the eye. Its major advantage is low cost, because it works with LCD, but it suffers from a high degree of distortion, low image resolution and reduced comfort.
Figure 23: Each display technology excels in different areas
Item: OLED (glass base) / LCD (glass base) / Micro display (silicon base)
Brightness: ★★ (~1,000 cd/m²) / ★★ (~1,000 cd/m²) / ★★★
Contrast ratio: ★★★ (>100,000:1) / ★* / ★★★ (>100,000:1)
Resolution: ★ / ★★★ / ★★★
Display life: ★★ / ★★★ / ★★★
Cost: ★★ / ★★★ / ★
Yield rate: ★★ / ★★★ / ★
* Can improve to ★★★ using local-dimming technology
Source: JDI, Credit Suisse
Waveguide and micro-LED solutions are at an early stage; they are very difficult to mass-produce at low cost, but companies such as Glo, VueReal, BOE, and AUO are investing heavily in micro-LED technology. As micro-LED technology advances, we expect the solution to be adopted by most AR makers in the future.
For AR glasses, Meta has Project Nazare, its first AR glasses that allow augmented reality overlays on the real world. AR requires integrating hologram displays, projectors, batteries, radios, custom silicon, cameras, speakers and sensors to map the world into glasses that are 5mm thick. Meta also introduced its Ray-Ban Stories, which allow users to take pictures, make phone calls, listen to music, and capture video, at US$299.
Meta also has Project Cambria, a new high-end headset. With Project Cambria, Meta would integrate “high-resolution coloured mixed-reality pass-through”, combining an array of sensors with reconstruction algorithms to represent the physical world in the headset with a sense of depth and perspective. With these innovations, the representation on the display is finally getting closer to what the eyes see in the physical world. For optics, the company is developing pancake optics, which fold light to achieve a slimmer profile than current lenses.
Vuzix, an AR headset company founded in 1997, recently unveiled the Vuzix Shield, which packs the battery, computer, cameras and display projectors into the temples of the glasses yet can be worn all day.
Figure 24: LCD is the best technology for 2-3 inch displays
Source: JDI
“Semiconductors are becoming the staples of the new data economy.”
Semiconductors are a levered
metaverse play
Since 1977, the global population has grown 86% from 4.2 bn to 7.9 bn while per
capita chip units have grown ~70x from two to 146—a massive accomplishment
even before considering the integration of functionality per chip
In addition to rising entry barriers, slowing supply and a
better demand mix increasing the global semi revenue
CAGR from 3-5% to 6-8%, we see an additional 100-300
bp of potential CAGR upside based on our view that semis
are the most levered play on the move to a data-driven
economy. Relative to the semiconductor ecosystem, we
divided our data thesis into four separate areas:
(1) Data creation/capture;
(2) Data storage;
(3) Data transmission; and
(4) Data analytics.
We argue that while data analytics is the most important
area—if you can’t analyse, you can’t monetise—we also
argue that each area is self-perpetuating: the more data you
can analyse, the more you want, which drives demand for
data creation/capture which in turn drives demand for
storage and transmission. While our Data Thesis Paradigm
is now well understood by investors (we first introduced it in
2010), there is an important subtlety that is still under-recognised. Specifically, the first three areas of Create, Store and Transmit have each benefitted from non-linear cost declines. Moore’s Law dictates that each year it is cheaper to create/capture data, and there are Moore’s Law equivalents in storage (Kryder’s Law) and in transmission (Butters’ Law).
Figure 25: Data paradigm (Source: Credit Suisse)
Figure 26: Declining cost curves drive application elasticity (Source: Credit Suisse)
Figure 27: Chips per capita accelerating, indexed against barrels of oil per capita (Source: Census.gov, Credit Suisse estimates)
Figure 28: Semis have upside to ~8% of global market cap (sector share of global market cap across energy, tech hardware, computer and software services, and semis; Source: Company data, the BLOOMBERG PROFESSIONAL™ service, Credit Suisse)
It is our observation that declining cost/function always
engenders new application growth and the elasticity of
application growth has historically grown TAMs (total
addressable markets) to be significantly larger than anyone
could imagine at inception. The exception until recently has
been analytics, whose cost has mostly only ever increased
through the digital age due to the vast majority of data
created being unstructured and most analytic models
needing clean/structured data. It is for this reason that while
data has delivered >50% CAGR for almost 20 years, 98%+
of the data the world creates remains dark.
We see AI as the first technology which holds the promise
of lowering the cost of analytics, and if our hypothesis is
correct, we see significant elasticity of application driving the
TAM for silicon. Defining the TAM for AI is difficult, but AI
will be a technology that corporations use in order to drive
efficiencies. The three largest spends for corporations are COGS, opex and capex, and it is worth highlighting that global spend across those three areas is ~US$45 tn/year—a 1% value capture by semis would imply US$450 bn of incremental revenue against our CY21 semi revenue forecast of ~US$500 bn.
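The sensitivity is simple to sanity-check; the short sketch below reruns the arithmetic using the figures quoted above (the 1% capture rate is purely illustrative):

```python
corporate_spend_tn = 45.0      # ~US$45 tn/year of global COGS + opex + capex
capture_rate = 0.01            # illustrative 1% value capture by semiconductors
cy21_semi_revenue_bn = 500.0   # ~US$500 bn CY21 semi revenue forecast

incremental_revenue_bn = corporate_spend_tn * 1_000 * capture_rate
print(f"Incremental revenue: ~US${incremental_revenue_bn:.0f} bn")                   # ~US$450 bn
print(f"Uplift vs CY21 base: {incremental_revenue_bn / cy21_semi_revenue_bn:.0%}")   # ~90%
```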
Equally important, while AI would likely create brand new
TAMs for both silicon and software, it has the potential to be
disruptive/deflationary to existing TAMs in the economy: i.e.,
not only is semi absolute growth poised to accelerate, its
relative growth profile could be even more attractive. While
software stocks seem to embed the AI opportunity, semi
stocks do not.
Addressing the world’s most consequential challenges is expected to require more, not less, silicon, and there are fewer companies with the IP and scale to produce it, while those that can continue to appreciate in value. Simply put, semis are becoming the staples of the new data economy—albeit with higher growth, higher returns and, yes, a higher level of cyclicality—but staples nonetheless.
Since 1977, the global population has grown 86% from 4.2
bn to 7.9 bn while during that same time period, per capita
chip units have grown ~70x from two to 146—a massive
accomplishment even before considering the integration of
functionality per chip which is a hallmark of the semi
industry. During that same time period, per capita
consumption of barrels of oil decreased ~10 pp.
In recent years, semis have increased from 1% of global
market cap to ~3%, and we see potential upside to ~8%
between now and the end of the decade. Energy is still at
~6% and peaked in 1980 at ~22%.
Forecasts for CY22 and beyond
After growing 26% YoY in CY21, we model semi revenue growth of +15% YoY in CY22, well above consensus at 8-10% YoY, but supported by ASPs which, even if only flat with CY4Q21, would grow 7% YoY. We model CY23 semi revenue at -5% YoY to reflect an inventory correction and a modest ASP decline. We model CY30 semi revenue of US$1 tn, a 5.8% CAGR. After growing 41% YoY in CY21, we expect WFE (wafer fab equipment) to grow 15% in CY22. We model WFE at -14% YoY in CY23 but see CY30 WFE of ~US$150 bn, an ~8% CAGR. We model SCE
services revenue of ~US$23 bn in CY22, growing to
~US$45 bn by 2030, ~9% CAGR and implying total CY30
SCE TAM approaching US$200 bn, ~2x CY21. Note that
these estimates are inclusive of TSMC’s CY4Q21 results.
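For reference, the growth rates quoted here follow the standard CAGR formula; the sketch below applies it to the SCE services endpoints stated in this paragraph (other bases and horizons in this section are not restated here):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values `years` apart."""
    return (end / start) ** (1 / years) - 1

# SCE services revenue: ~US$23 bn in CY22 growing to ~US$45 bn by 2030 (8 years)
print(f"SCE services CAGR: {cagr(23, 45, 2030 - 2022):.1%}")  # ~8.8%, i.e. ~9%
```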
By end market, we see the best CY22 growth in data centre/cloud, auto, enterprise and wireless infrastructure, with more modest growth in infrastructure and handsets, and flat-to-down growth in PCs/CE. Longer term, data centre, auto and industrial, each ~10% of semi consumption, have the potential to double as a percentage of revenue by 2030.
Figure 29: Semi ASP leverage remains underappreciated (3-year moving average semi ASP, roughly US$0.40-0.52 over CY00-CY23E, with -1% and +2% CAGR trendlines; Source: SIA, Credit Suisse estimates)
Figure 30: ASPs flat with CY4Q21 support ~50% of CY22 revenue growth YoY (semi units, ASPs and revenue, % YoY; Source: SIA, Credit Suisse estimates)
“Telecom infra will provide the backbone to deliver a smooth Metaverse experience.”
Metaverse rollout raises demand for
bandwidth
Even with modest Metaverse assumptions, data usage could easily expand more
than 20x during this decade
Streaming audio and video media are the primary use cases of the consumer internet today, with consumers in the developed world accustomed to spending a majority of waking hours consuming some sort of media. American adults consume six hours a day of video (whether professionally produced TV and movies, or user-generated content on YouTube or Twitch), just under two hours a day of audio (primarily music, but also a small but growing share of spoken-word content), more than an hour of social media and nearly four hours of other internet content. Video, with its high bitrate, makes up four-fifths of all consumer internet traffic on its own. (Video games are not specifically measured by Nielsen, but they generally use little data and comprise only a mid-single-digit share of consumer internet traffic.)
Figure 31: Nielsen’s digital metering shows 10+ hours a day of media use by American adults, led by video and followed by audio and browsing
[Chart: American adults’ daily hours of media use by quarter, 1Q17-3Q20, rising from roughly 10 hours per day in 2017 to around 12 hours by mid-2020, with video consistently the largest component at roughly 5-6 hours a day.]
Note: Not all media types available for all periods. Includes simultaneous multi-media usage.
Source: Company data, Nielsen Total Audience Reports (1Q17-3Q20), Credit Suisse estimates
Figure 32: The American Time Use Survey shows media use comprises two-thirds of all leisure time. (The ATUS
records primary activities so its lower-than-Nielsen media use is attributable to media use as a secondary activity,
such as listening to a show while doing chores.)
[Charts: per the ATUS, respondents average roughly 8.7-8.8 hours/day of sleep, ~4.0 hours of work and school, ~5.9-6.2 hours of eating, chores and all other activities, and ~5.1-5.4 hours of leisure, of which ~2.8 hours is watching TV and ~0.7 hours is other media use (2009-2019).]
Note: (1) All respondents aged 15+, weekdays + weekends. (2) ‘Other Media’ is reading, playing games, and computer use other than for games.
Source: Company data, BLS American Time Use Survey, Credit Suisse estimates
XR places extraordinary demands on networks
Extended reality (XR) technologies are a fundamental
building block of the metaverse.
Virtual Reality (VR) combines visual inputs for each
eye with positional tracking to present the user with a
sense of spatial immersion in the displayed world,
perhaps with haptic touch and even environmental
feedback. VR experiences are today typically delivered with a dedicated headset connected to a latest-generation console or high-end PC; they include AAA video games such as Fallout 4 VR and Half-Life: Alyx, in which the player is fully immersed in the game world, as well as modelling, engineering and architecture programmes such as Enscape. Current efforts such as Oculus by Meta aim to push VR into headsets with onboard computation capability that eliminates console/PC dependency. Apple is widely speculated to be introducing its standalone VR headset in 2022.
Augmented Reality (AR) overlays images onto the
real world, such as onto a transparent display surface.
AR applications include Niantic’s hit 2016 mobile game
Pokémon Go and Snapchat’s Lenses feature which can
modify users’ faces, bodies, or backgrounds. AR
experiences are becoming widely available.
Figure 33: With high time spent and high bitrates, streaming video has become the primary driver of consumer internet traffic
[Charts: estimated share of global consumer internet traffic for fixed and mobile, with internet video at roughly 80% of the total; and global mobile network traffic by application type (2011-2026E), with video rising from roughly 30% of mobile traffic in 2011 to around half in 2016 and an estimated three-quarters or more by 2026; other categories include social networking, web/email/data, audio, file sharing, online gaming and software updates.]
Source: Company data, Cisco VNI 2018, Credit Suisse estimates (left); Company data, Ericsson, Credit Suisse estimates (right)
Figure 34: Streaming video usage growth has driven consistent ~30% annual bandwidth usage growth, but metaverse applications might substantially accelerate traffic needs over time
[Charts: North America consumer internet traffic rising from 14 EB/month to an estimated 378 EB/month; US per-household wireline data usage rising from roughly 130 GB/month to an estimated ~1,870 GB/month, and mobile usage from ~6 GB/month to ~117 GB/month, with wireline consistently around 16-23x mobile usage per household.]
Source: Company data, Cisco VNI, Ericsson, US Census Bureau, Credit Suisse estimates
Figure 35: XR requires higher speeds and lower latency than browsing and streaming
[Chart: bitrate requirements by application, ranging from tens to hundreds of kbps for web browsing, remote desktop and music streaming, roughly 1-35 Mbps for streaming video (SD through 4K), 2-25 Mbps for video gaming, roughly 25-50 Mbps for AR, and 200-300+ Mbps for VR and holograms.]
Source: Highspeedinternet.com, Restream.io, Credit Suisse estimates
Mixed Reality (MR) combines features of both VR and AR, with manipulable, fully 3D virtual objects anchored in real space. Hologram technologies, such as the work of Light Field Lab, use a similar fully 3D data structure but a very different display technology, presenting an actual image in real 3D space rather than merely the illusion of one to a user wearing a specialised headset.
VR and MR experiences place high demands on both devices and networks. Each eye generally has its own display, and the displays must be high resolution because they sit so close to the eyes. Moreover, the displays generally run at a high refresh rate to avoid motion sickness, typically 72, 90, or 120 frames per second (FPS) compared with the 24 FPS norm for video content. Apple’s headset is also speculated to include a third display to increase immersion, as well as several cameras to gauge the environment and user activity.
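The scale of the problem is easy to illustrate with a back-of-the-envelope calculation; the figures below are our own assumptions based on a Quest 2-class panel, not a vendor specification, and show why uncompressed delivery is out of the question:

```python
# Rough, illustrative estimate of the raw (uncompressed) video bandwidth a
# Quest 2-class headset would need without any compression.
width, height = 1832, 1920    # assumed per-eye resolution
eyes = 2                      # one display (or panel half) per eye
fps = 90                      # typical VR refresh rate vs 24 FPS for film
bits_per_pixel = 24           # 8-bit RGB

raw_bits_per_second = width * height * eyes * fps * bits_per_pixel
print(f"Uncompressed: ~{raw_bits_per_second / 1e9:.0f} Gbps")  # ~15 Gbps
```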
Nascent video game streaming services suggest an
interesting parallel for XR. Video games’ rendering work is
traditionally done on-device, so that the best-quality levels
require a powerful console or purpose-built PC with gaming-specific hardware. Video game streaming services such as
Google Stadia and Nvidia GeForce Now instead offload such
rendering tasks to a remote compute cloud—allowing the
highest level of quality on weaker devices such as
smartphones. Video game streaming has higher download
requirements than streaming video (games require a higher
frame rate than video content’s 24 FPS to give a smooth
illusion of motion), and the services are also less latency- and
loss-tolerant (to provide satisfying responsiveness to inputs).
To an even greater extent than AAA video games, VR and MR are highly computationally intensive.
Figure 36: Buffering—building a cache as data arrives, then playing it out evenly to provide a smooth stream—hides
latency, jitter and packet loss. This means that streaming video/audio user experience is generally little affected by
network quality.
Application network demands (streaming video / streaming audio / video gaming on-device / video gaming cloud / metaverse)
Bandwidth needs: High / Low / Low / Moderate / Very High
Latency tolerance: High / High / Low / Low / Low
Loss tolerance: High / High / Low / Low / Low
Device compute: Low / Low / High / High / Moderate
Edge/cloud compute: Low / Low / Moderate / High / Very High
Source: SIA, Credit Suisse estimates
These experiences can still be delivered on simpler, lighter, and cheaper end-user devices if computationally intensive tasks can be offloaded to a cloud compute instance. To the extent this is possible, it would drive down the cost of XR devices and allow mass adoption. Verizon has estimated that any more than 20ms of motion-to-photon (total stack) latency causes many users to become nauseated; for comparison, well-built wireline broadband networks today typically have 20ms of network latency alone, and typical LTE latencies are 3x higher.
For metaverse extended-reality compute to be offloaded to the network, the entire loop must be shortened so that input from the user device, a network trip, processing by the service, a return network trip and drawing the output on the user device all fit within the 20ms taken by a single network trip today. This would require driving network latency down significantly from current ~20ms levels with careful network engineering, relying on the capabilities of low-latency DOCSIS and 5G.
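A simple budget shows how little headroom remains once rendering moves off the device. The 20ms ceiling is the Verizon figure cited above; the split across stages below is purely our illustrative assumption:

```python
# Illustrative motion-to-photon budget for cloud-rendered XR (all values in ms).
# Only the 20 ms ceiling comes from the report; the breakdown is an assumption.
budget_ms = 20

pipeline = {
    "sensor sampling + uplink encode": 2,
    "uplink network trip": 5,
    "cloud rendering": 5,
    "downlink network trip": 5,
    "decode + display scan-out": 3,
}

total = sum(pipeline.values())
print(f"Total: {total} ms ({'within' if total <= budget_ms else 'over'} the {budget_ms} ms budget)")
# With today's ~20 ms wireline round trips, the two network legs alone would
# consume the entire budget, which is why edge compute and low-latency
# DOCSIS/5G are needed.
```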
Figure 37: Even with modest metaverse assumptions, data usage could easily expand more than 20x during this decade—and metaverse traffic cannot buffer like streaming video
Household data diet: indicative metaverse data consumption (daily / monthly usage, with notes and assumptions)
2022
Video usage: 15.0 hrs/day / 450 hrs/month (indicative 6 hrs/adult/day and 2.5 members per HH)
Video bitrate: 1.4 GB/hr (SD is ~700MB/hr, FHD is ~3GB/hr)
Video usage: 21.3 GB/day / 640 GB/month
All other data: 5.3 GB/day / 160 GB/month (web browsing, audio streaming, gaming; 1/4 of all video usage)
Total data usage: 26.7 GB/day / 800 GB/month
2032
Metaverse usage: 5.0 hrs/day / 150 hrs/month (conservatively a 1:1 substitution for video hours, i.e. no increase in total media consumption)
Metaverse bitrate: 112 GB/hr (two streams, each 8K 120Hz, 8x the 4K 60Hz bitrate)
Metaverse usage: 560 GB/day / 16,800 GB/month
Video usage: 10.0 hrs/day / 300 hrs/month
Video bitrate: 7.0 GB/hr (assumes average video content consumed is 4K)
Video usage: 70 GB/day / 2,100 GB/month (3.3x, or 13% CAGR)
All other data: 13.8 GB/day / 415 GB/month (2.6x, or 10% CAGR)
Total data usage: 643.8 GB/day / 19,315 GB/month (24.1x, or 37% CAGR)
Source: Company data, Credit Suisse estimates
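The 2032 column follows directly from the stated assumptions; the sketch below reproduces the headline numbers:

```python
# Reproduce the indicative 2032 household data diet from Figure 37's assumptions.
DAYS_PER_MONTH = 30

# Metaverse: 5 hrs/day, two 8K 120Hz streams at 8x the 4K 60Hz bitrate (~7 GB/hr)
metaverse_gb_per_hr = 2 * 8 * 7.0              # = 112 GB/hr
metaverse_gb_day = 5.0 * metaverse_gb_per_hr   # = 560 GB/day

# Remaining video: 10 hrs/day at an average 4K bitrate of 7 GB/hr
video_gb_day = 10.0 * 7.0                      # = 70 GB/day

other_gb_month = 415.0                         # all other data, per the figure
total_gb_month = (metaverse_gb_day + video_gb_day) * DAYS_PER_MONTH + other_gb_month

print(f"Metaverse: {metaverse_gb_day * DAYS_PER_MONTH:,.0f} GB/mo")   # 16,800
print(f"Total:     {total_gb_month:,.0f} GB/mo")                      # 19,315
print(f"Growth vs 2022: {total_gb_month / 800:.1f}x")                 # ~24.1x
```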
Figure 38: 6G to perhaps provide wireless networks with greater metaverse capabilities, but deployment timeframes to lag fixed networks
Mobile technology generations (generation / year introduced / typical DL speed, Mbps / spectral efficiency, bps/Hz / latency, ms / use cases):
1G: 1979 / 0.002 / 0.5 / 1,000 / Analog voice
2G: 1991 / 0.1 / 1.3 / 600 / Digital voice, SMS
3G: 2001 / 8 / 2.6 / 65 / Email, basic browsing, early IoT
4G: 2009 / 30 / 4.3 / 50-60 / Streaming video and audio, ride hailing/maps, rich social networking
4.5G (LTE-A): 2014 / 100 / 15 / 30-50 / Consistent streaming video and audio experience, augmented reality, fixed wireless access
5G: 2019 / 300 / 30 / 21-26 (10 theoretical) / Better streaming media, virtual reality, massive IoT
6G: 2029? / 1,000+ / 100+ / 1 (theoretical) / VR, mobile metaverse, massive digital twinning
Source: Company data, CTIA, Credit Suisse estimates
Market Commentary Disclaimer:
While this post has been prepared by the Securities Research business of Credit Suisse AG, its subsidiary or affiliate (“CS”) and may
contain references to Securities Research reports and/or Securities Research analysts, it is for information only and does not constitute
research by CS. Furthermore, this post is not to be regarded as a sales prospectus or an offer or solicitation of an offer to enter in any
investment activity. This post does not take into account your specific investment objectives and needs nor your financial situation. No
representation or warranty, either expressed or implied is provided in relation to the accuracy, completeness or reliability of the information
contained herein, nor is it intended to be a complete statement or summary of the developments referred to in this post and any liability
therefore (including in respect of direct, indirect or consequential loss or damage) is expressly disclaimed. The information and any opinions
expressed in this post are subject to change without notice and may differ or be contrary to opinions expressed by other business areas or
groups of CS as a result of using different assumptions and criteria. CS is not providing any financial, economic, legal, accounting, or tax
advice or recommendations in this post. In addition, the receipt of this post alone by you is not to be taken to constitute you as a client of
any CS entity. CS is under no obligation to update or keep current the information contained in this material. Please consult with your client
advisor before taking any investment decision. This material is issued and distributed in the United States by CSSU, a member of NYSE,
FINRA, SIPC and the NFA, which accepts responsibility for its content. Clients should execute transactions through a Credit Suisse entity
in their home jurisdiction unless governing law permits otherwise.