AI Empowering the Crypto Assets Industry: From the Industry Chain to Innovative Applications
The Fusion of AI and Crypto Assets: From Zero to Peak
Artificial intelligence technology has made groundbreaking progress in recent years and is regarded by some as the fourth industrial revolution. The emergence of large language models has significantly improved efficiency across various industries, with Boston Consulting Group estimating that GPT has raised work efficiency in the U.S. by approximately 20%. At the same time, the generalization ability of large models is seen as a new software design paradigm: unlike traditional precise coding, modern software design increasingly embeds generalized large-model frameworks into software, yielding better performance and broader support for modal inputs and outputs. Deep learning technology has indeed brought a new wave of prosperity to the AI industry, and this wave has also extended to the Crypto Assets industry.
Development History of the AI Industry
The AI industry began in the 1950s. To realize the vision of artificial intelligence, academia and industry have developed various schools of thought based on different disciplinary backgrounds at different times.
Modern artificial intelligence technology mainly adopts the "machine learning" method, which involves allowing machines to rely on data to iteratively improve system performance in tasks. The main steps include: inputting data into algorithms, training models with data, testing and deploying models, and using models to complete automated prediction tasks.
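A minimal sketch of this input-train-test-predict loop, using scikit-learn and the toy iris dataset purely for illustration (not a specific system described here):

```python
# Minimal sketch of the train / test / deploy-predict loop described above.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# 1. Input data into the algorithm
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# 2. Train the model with data
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# 3. Test the model
print("test accuracy:", model.score(X_test, y_test))

# 4. Use the model for automated prediction
print("prediction:", model.predict(X_test[:1]))
```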
Currently, there are three main schools of thought in machine learning: connectionism, symbolic AI, and behaviorism, which respectively imitate the human nervous system, thinking, and behavior. Connectionism, represented by neural networks (also known as deep learning), is currently dominant. A neural network's architecture includes an input layer, an output layer, and multiple hidden layers. When the number of layers and of neurons (parameters) is large enough, the network can fit complex general tasks. By continuously feeding in data and adjusting the neuron parameters, the network eventually converges to an optimal set of parameters.
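To make the layer structure and parameter adjustment concrete, here is a minimal PyTorch sketch of a small feedforward network, with dummy data standing in for a real dataset (illustrative only):

```python
# A tiny feedforward network: one input layer, two hidden layers, one output layer.
# The training loop repeatedly adjusts the neuron parameters (weights) from data,
# which is the "continuous adjustment toward an optimal state" described above.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 16),   # input layer -> hidden layer 1
    nn.ReLU(),
    nn.Linear(16, 16),  # hidden layer 2
    nn.ReLU(),
    nn.Linear(16, 3),   # output layer
)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Dummy data standing in for a real dataset.
x = torch.randn(32, 4)
y = torch.randint(0, 3, (32,))

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()      # compute how each parameter should change
    optimizer.step()     # adjust the parameters
```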
Deep learning technology has itself undergone multiple iterations and evolutions, from the earliest neural networks to feedforward networks, RNNs, CNNs, and GANs, finally developing into modern large models such as GPT, which use Transformer technology. The Transformer is just one evolutionary direction of neural networks: it adds a converter that encodes data from various modalities (such as audio, video, and images) into corresponding numerical representations, which are then fed into the neural network, allowing it to fit any type of data and achieve multimodality.
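A small sketch of the "encode into numbers, then feed the network" idea, using PyTorch's embedding and Transformer encoder layers with a toy three-word vocabulary; the same embedding dimension could equally carry image-patch or audio-frame features:

```python
# Text is mapped to token ids, token ids to embedding vectors,
# and the numerical sequence is then processed by a Transformer encoder layer.
import torch
import torch.nn as nn

vocab = {"ai": 0, "meets": 1, "crypto": 2}
token_ids = torch.tensor([[vocab["ai"], vocab["meets"], vocab["crypto"]]])

embed = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)
encoder_layer = nn.TransformerEncoderLayer(d_model=8, nhead=2, batch_first=True)

x = embed(token_ids)    # (batch, sequence, d_model) numerical representation
out = encoder_layer(x)  # attention mixes information across the sequence
print(out.shape)        # torch.Size([1, 3, 8])
```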
The development of AI has gone through three technological waves:
In the 1960s, the first wave, triggered by the development of symbolic techniques, addressed problems of general natural language processing and human-computer dialogue; expert systems were also born in this period.
In 1997, IBM's Deep Blue defeated chess champion Garry Kasparov, marking the second peak of AI technology.
In 2006, the concept of deep learning was proposed, initiating the third wave of technological revolution. Deep learning algorithms gradually evolved from RNN, GAN to Transformer and Stable Diffusion, shaping the heyday of connectionism.
Deep Learning Industry Chain
Currently, large models generally adopt deep learning methods based on neural networks. Large models represented by GPT have triggered a new wave of artificial intelligence enthusiasm, with a large number of players flooding into this track and a surge in market demand for data and computing power. Therefore, we focus on exploring the industrial chain of deep learning algorithms, analyzing how the upstream and downstream are composed in the AI industry dominated by deep learning, as well as the current situation, supply and demand relationships, and future development of the upstream and downstream.
The training of Transformer-based large language models (LLMs) such as GPT is mainly divided into three steps (sketched in code after the list):
Pre-training: Input a large amount of data pairs to find the optimal parameters for each neuron in the model. This is the most computationally intensive process and requires repeated iterations to try various parameters.
Fine-tuning: Use a small amount of high-quality data for training to improve the quality of model output.
Reinforcement Learning: Establish a "reward model" to rank the output results of the large model for the automatic iteration of the model parameters. Sometimes, human participation is also needed to evaluate the quality of the model output.
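The sketch below outlines the three stages in simplified Python; train_on, generate, and rank_outputs and the model/reward objects are hypothetical placeholders, not a real training API:

```python
# Highly simplified outline of the three training stages described above.

def pretrain(model, web_scale_corpus):
    # Stage 1: most compute-intensive; fit parameters on massive unlabeled data.
    for batch in web_scale_corpus:
        model.train_on(batch, objective="next-token-prediction")

def fine_tune(model, curated_pairs):
    # Stage 2: a small amount of high-quality instruction/answer pairs.
    for prompt, answer in curated_pairs:
        model.train_on((prompt, answer), objective="supervised")

def reinforcement_learning(model, reward_model, prompts):
    # Stage 3: a reward model ranks candidate outputs; the ranking signal
    # (sometimes with human feedback) drives further parameter updates.
    for prompt in prompts:
        candidates = [model.generate(prompt) for _ in range(4)]
        scores = reward_model.rank_outputs(prompt, candidates)
        model.train_on((prompt, candidates, scores), objective="policy-gradient")
```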
The three main factors affecting the performance of large models are the number of parameters, the amount and quality of data, and computing power. These three elements give rise to an entire industrial chain.
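A commonly cited rule of thumb (an assumption added here, not a claim from this overview) is that training compute scales as roughly 6 × parameters × tokens, which shows how the three elements trade off:

```python
# Rule of thumb: training compute ~ 6 * parameters * tokens FLOPs.
# Illustrative numbers only.
params = 70e9    # a 70B-parameter model (assumed)
tokens = 1.4e12  # 1.4T training tokens (assumed)
flops = 6 * params * tokens
print(f"~{flops:.2e} FLOPs")  # ~5.88e+23 FLOPs

# At an assumed sustained 1e15 FLOP/s (1 PFLOP/s) of effective compute:
seconds = flops / 1e15
print(f"~{seconds / 86400:.0f} days on 1 PFLOP/s")  # roughly 6,800 days
```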
Hardware (GPU) Providers
Currently, Nvidia is in an absolutely leading position in the AI GPU chip design field. The academic community mainly uses consumer-grade GPUs like the RTX series, while the industrial sector mainly uses chips such as the H100 and A100 for the commercial implementation of large models.
In 2023, Nvidia's latest H100 chip was heavily ordered by multiple companies as soon as it was released. The global demand for the H100 chip far exceeds supply, with a delivery cycle reaching 52 weeks. To reduce dependence on Nvidia, Google has led the establishment of the CUDA Alliance with companies like Intel, Qualcomm, Microsoft, and Amazon to jointly develop GPUs.
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-f37fb0100218188368f4d31940aab2a3.webp)
Cloud Service Providers
After purchasing large numbers of GPUs and building high-performance computing clusters, cloud service providers offer elastic computing power and managed training solutions to AI companies with limited funding. The market currently comprises three main types of cloud computing power providers:
Traditional cloud vendors running large-scale general-purpose cloud platforms (such as AWS, Google Cloud, and Azure)
Vertical cloud computing power platforms built mainly for AI or high-performance computing
Emerging inference-as-a-service providers that mainly deploy pre-trained models for clients and perform fine-tuning or inference
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-8848582a34ba293d15afae15d90e3c95.webp)
Database Providers
For AI data and deep learning training and inference tasks, the industry mainly uses "vector databases". Vector databases can efficiently store, manage, and index massive amounts of high-dimensional vector data, unifying the storage of unstructured data in the form of "vectors".
Main players include Chroma, Zilliz, Pinecone, Weaviate, etc. As data demand increases and large models and applications in various subfields emerge, the demand for vector databases will grow significantly.
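At its core, a vector database stores high-dimensional vectors and returns the nearest ones to a query; a brute-force NumPy sketch of that idea follows, while real systems such as Chroma, Zilliz, Pinecone, and Weaviate add indexing (e.g. HNSW) to scale it:

```python
# Minimal brute-force vector search over a toy in-memory "store".
import numpy as np

# Unstructured items already embedded into vectors (dimension 4 for illustration).
store = {
    "doc_a": np.array([0.1, 0.9, 0.0, 0.2]),
    "doc_b": np.array([0.8, 0.1, 0.1, 0.0]),
    "doc_c": np.array([0.2, 0.8, 0.1, 0.1]),
}

def cosine(a, b):
    # Similarity between two vectors, higher = closer.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def query(vec, k=2):
    # Return the ids of the k most similar stored vectors.
    scored = sorted(store.items(), key=lambda kv: cosine(vec, kv[1]), reverse=True)
    return [name for name, _ in scored[:k]]

print(query(np.array([0.15, 0.85, 0.05, 0.15])))  # -> ['doc_a', 'doc_c']
```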
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-53c48daf49a3dbb35c1a2b47e234f180.webp)
Edge Devices
Building a GPU high-performance computing cluster consumes a large amount of energy and generates heat. To keep the cluster running continuously, cooling systems and other supporting edge devices are required.
In terms of energy supply, electricity is mainly used. Data centers and supporting networks currently account for 2%-3% of global electricity consumption. BCG predicts that by 2030, the electricity consumption for training large models will triple.
In terms of heat dissipation, air cooling is currently the main method, but liquid cooling systems are receiving significant investment. Liquid cooling is mainly divided into three types: cold plate, immersion, and spray.
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-250a286e79261e91a0e7ac4941ff5c76.webp)
AI Applications
Currently, the development of AI applications is similar to the blockchain industry, with very crowded infrastructure, but application development is relatively lagging. Most of the currently active AI applications are search-type applications, which are relatively homogeneous.
The user retention rate of AI applications is generally lower than that of traditional internet applications. In terms of the proportion of active users, the median DAU/MAU for traditional internet software is 51%, while the highest for AI applications is only 41%. Regarding user retention rates, the median for the top ten traditional internet software is 63%, while ChatGPT's retention rate is only 56%.
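For reference, the stickiness metric quoted above is simply daily active users divided by monthly active users; the numbers below are made up purely to show the calculation:

```python
# DAU/MAU "stickiness" ratio with illustrative (not real) figures.
dau = 4_100_000
mau = 10_000_000
print(f"DAU/MAU = {dau / mau:.0%}")  # 41%, the figure cited for the best AI apps
```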
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-8358c377eb2c07467848b3b46dbf1056.webp)
The Relationship Between Crypto Assets and AI
Thanks to the development of technologies such as zero-knowledge proofs, blockchain has evolved around the ideas of decentralization and trustlessness. Essentially, the entire blockchain network is a value network, with each transaction representing a value conversion based on the underlying token. Token economics defines the relative value of the ecosystem's settlement asset (the native token).
Token economics can assign value to any innovation and existence, whether it is an idea or a physical creation. This means of redefining and discovering value is also crucial for the AI industry. Issuing tokens in the AI industry chain allows for value reshaping at various stages, incentivizing more people to delve into the sub-sectors of the AI industry. Tokens can also feed back into the ecosystem, promoting the birth of certain philosophical ideas.
The immutability and trustlessness characteristics of blockchain have practical significance in the AI industry, enabling the realization of applications that require trust. For example, it ensures that models do not know the specific content of user data when using it, do not leak data, and return genuine inference results. When there is a shortage of GPU supply, distribution can be done through the blockchain network; when GPUs are iterated, idle GPUs can contribute computing power to the network, thereby regaining value.
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-c8845a5920048e7c821c939e2d8304e7.webp)
Overview of AI-Related Projects in the Crypto Assets Industry
GPU Supply Side
In the AI industry chain of the Crypto Assets sector, computing power supply is the most important link. Currently, a project with good fundamentals is Render, which is mainly used for video rendering tasks rather than large-model workloads.
Industry forecasts put demand for GPU computing power at about $75 billion in 2024, reaching $773 billion by 2032, a compound annual growth rate of roughly 33.86%. With the explosion of the GPU market and the effect of Moore's Law, large numbers of non-latest-generation GPUs will exist in the future, and these idle GPUs can continue to create value in shared networks.
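Those growth figures are internally consistent; a quick check of the implied compound annual growth rate:

```python
# Sanity check: growth from ~$75B (2024) to ~$773B (2032).
start, end, years = 75e9, 773e9, 2032 - 2024
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR ≈ {cagr:.2%}")  # matches the ~33.86% quoted above
```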
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-2ed56db6cae1b0206e8e0daa9b1892fd.webp)
Hardware Bandwidth
Bandwidth is often a major factor affecting the training time of large models, especially for on-chain cloud computing. However, shared bandwidth may be a false proposition: in high-performance computing clusters, data is mainly stored on local nodes, whereas under a shared-bandwidth model data sits some distance away, and the latency introduced by geographic separation is far higher than that of local storage.
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-c733612f35e5a4a21d8d39a0a77f85b8.webp)
Data
The Crypto Assets industry AI data projects launched so far include EpiK Protocol, Synesis One, Masa, and others. Compared with traditional data companies, Web3 data providers have an advantage in data collection, since individuals can contribute non-private data (and, through zero-knowledge proof technology, even private data). This expands the coverage of such projects, which target not only enterprises but also make data pricing possible for any user.
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-b97be5c0fd9efe0b79e2e6fcd4493212.webp)
ZKML (Zero-Knowledge Machine Learning)
To enable privacy-preserving computation and training on data, the industry mainly adopts zero-knowledge proof schemes, using homomorphic encryption to perform inference off-chain and then posting the results together with a zero-knowledge proof on-chain. This preserves data privacy while achieving efficient, low-cost inference.
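Conceptually, the flow looks like the sketch below; prove_inference and verify_on_chain are hypothetical placeholders standing in for a real proving system (e.g. a zkSNARK circuit), not an actual library API:

```python
# Conceptual ZKML flow only; the proving and verification calls are hypothetical.

def private_inference(model, user_input):
    # 1. Run the model off-chain on data the chain never sees.
    result = model(user_input)
    # 2. Produce a zero-knowledge proof that `result` really came from `model`
    #    applied to an input satisfying the claimed constraints.
    proof = prove_inference(model, user_input, result)  # hypothetical
    # 3. Publish only the result and the proof; verifiers learn nothing else.
    verify_on_chain(result, proof)                      # hypothetical
    return result
```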
In addition to focusing on off-chain training and inference projects in the AI field, there are also some general-purpose zero-knowledge projects, such as Axiom, Risc Zero, and Ritual, which can provide zero-knowledge proofs for any off-chain computation and data, thus expanding the application boundaries.
![Beginner's Guide | AI x Crypto: From Zero to Peak](https://img-cdn.gateio.im/webp-social/moments-a732f2716e6ef577c2e5817efcec3546.webp)
AI Applications
AI applications in the Crypto Assets industry resemble those in the traditional AI sector: most are still in the infrastructure-building phase, and downstream application development remains relatively weak. These AI + blockchain applications mostly combine traditional blockchain applications with automation and generalization capabilities; for example, AI Agents can execute optimal DeFi trading or lending paths based on user needs.
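A toy illustration of the "agent picks the optimal lending path" idea; the protocol names and rates below are invented for the example, and a real agent would also weigh risk, gas costs, and liquidity:

```python
# Hypothetical on-chain quotes fetched by an agent (made-up names and numbers).
lending_rates = {
    "ProtocolA": 0.031,
    "ProtocolB": 0.045,
    "ProtocolC": 0.038,
}

def choose_lending_venue(rates):
    # The agent's "decision" here is just an argmax over expected yield.
    return max(rates, key=rates.get)

best = choose_lending_venue(lending_rates)
print(f"Agent routes deposit to {best} at {lending_rates[best]:.1%} APY")
```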
Fetch.AI is a representative AI Agent project. It defines an AI Agent as "a self-operating program on a blockchain network that can connect and search".