A curated list of the most influential technology terms from the past decade.
Systems that learn and improve from experience without being explicitly programmed.
A subset of AI that focuses on building algorithms that improve automatically through examples.
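A minimal sketch of "improving automatically through examples": fitting a line to data points with ordinary least squares, using only the standard library. The data and function names here are illustrative, not from any real dataset.

```python
def fit_line(xs, ys):
    """Return slope w and intercept b minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form least-squares solution for a single feature.
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return w, b

# The model derives its parameters from examples rather than having
# them hard-coded; here the underlying rule is y = 2x.
w, b = fit_line([1, 2, 3, 4], [2, 4, 6, 8])
```

The point of the sketch is the workflow, not the algorithm: parameters come out of the data, which is what distinguishes learning from explicit programming.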
An AI subset that uses deep neural networks to process unstructured data such as images and audio.
A complex network of interconnected nodes that loosely mimics the structure of the human brain.
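A minimal sketch of how such a network of nodes computes: each node takes a weighted sum of its inputs and passes it through a nonlinearity. The weights and layer sizes below are arbitrary illustrative values.

```python
import math

def sigmoid(x):
    """A common nonlinearity squashing any value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    """One dense layer: every output node connects to every input node."""
    return [sigmoid(sum(w * i for w, i in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Two inputs feeding three hidden nodes, then one output node.
hidden = layer([0.5, -1.0],
               [[0.1, 0.4], [-0.3, 0.8], [0.7, -0.2]],
               [0.0, 0.1, -0.1])
output = layer(hidden, [[0.6, -0.9, 0.3]], [0.05])
```

Stacking many such layers is what turns this into the "deep" networks described above.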
A computing paradigm that uses quantum-mechanical phenomena to perform operations on data.
A model for enabling scalable access to computing resources over the internet.
The vast amounts of data available today that require new processing models to handle.
A network of physical objects connected to the internet to share data and functions.
A decentralized digital ledger that records transactions across a distributed network.
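A minimal sketch of the hash-chaining idea behind such a ledger: each block stores the hash of its predecessor, so altering any past transaction invalidates every later link. The field names and transactions are illustrative.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, transactions):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

chain = []
add_block(chain, ["alice->bob: 5"])
add_block(chain, ["bob->carol: 2"])

# Tampering with block 0 breaks the link stored in block 1.
tampered = chain[1]["prev_hash"] != block_hash(
    {**chain[0], "transactions": ["alice->bob: 500"]})
```

Real blockchains add consensus and distribution on top; the sketch shows only why the history is tamper-evident.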
A type of AI that would possess human-like abilities across various tasks.
A set of technologies that enables machines to understand and process the meaning of text.
The next generation of the internet focused on decentralization, blockchain, and user sovereignty.
A technology that superimposes digital information onto a view of the real world.
A technology that creates interactive 3D environments for immersive experiences.
A computing model that brings computation and data storage closer to the location where it is used.
A combination of public and private cloud solutions providing flexibility and security.
A cloud computing model that allows developers to focus on code without managing servers.
A practice that combines software development and IT operations to deliver high-quality software faster.
A practice that enables teams to integrate code frequently and deliver updates reliably.
A design approach that breaks down applications into small, independent services.
A business model where application programming interfaces are used to connect different services.
A collaborative way of developing software where source code is shared freely.
The protection of digital infrastructure and data from unauthorized access or attacks.
A principle that ensures security is built into every aspect of a system's design.
A project management methodology that emphasizes iterative progress and frequent delivery of working software.
A framework within Agile that provides a method for managing and monitoring project progress.
A method that visualizes work and limits work in progress to enhance team efficiency.
A production philosophy that eliminates waste and maximizes value added to products.
A document that outlines the principles of agile software development.
The spiral model, an iterative, risk-driven software development life cycle that serves as an alternative to the traditional waterfall model.
Minimum Viable Product, a concept that allows startups to test their ideas before scaling.
A containerization platform that packages applications with their dependencies so they run consistently across environments.
A container orchestration tool that automates the deployment, scaling, and management of containerized applications.
A version control system that enables collaboration and tracking of changes in software development.
A platform for hosting and sharing code repositories using Git.
A series of steps that automate the integration, testing, and deployment of software.
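A minimal sketch of such a pipeline as ordered stages, each of which must succeed before the next runs. Real pipelines execute shell steps on a CI server; the stage names and checks here are illustrative.

```python
def run_pipeline(stages):
    """Run stages in order; stop and report the first failure."""
    completed = []
    for name, step in stages:
        if not step():
            return completed, name  # name of the failed stage
        completed.append(name)
    return completed, None

stages = [
    ("build", lambda: True),       # e.g. compile the project
    ("test", lambda: 1 + 1 == 2),  # e.g. run the unit test suite
    ("deploy", lambda: True),      # e.g. push artifacts to production
]
completed, failed = run_pipeline(stages)
```

The fail-fast ordering is the core design choice: a broken build never reaches the deploy stage.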
A computing model where the backend is managed by a third-party provider, eliminating the need for server management.
A design approach where applications react to events rather than being controlled by a central controller.
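A minimal sketch of that reactive style: components register handlers for named events instead of being invoked by a central controller. The event name and handlers are illustrative.

```python
handlers = {}

def subscribe(event, handler):
    """Register a handler to react to a named event."""
    handlers.setdefault(event, []).append(handler)

def publish(event, payload):
    """Notify every handler subscribed to the event."""
    for handler in handlers.get(event, []):
        handler(payload)

log = []
subscribe("order_placed", lambda order: log.append(f"bill {order}"))
subscribe("order_placed", lambda order: log.append(f"ship {order}"))
publish("order_placed", "order-42")
```

The publisher knows nothing about billing or shipping; new reactions can be added without touching existing code.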
A messaging service that allows applications to queue messages for processing.
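A minimal sketch of queued messaging using only the standard library: a producer enqueues messages and a consumer processes them asynchronously. Real systems would use a managed broker service rather than an in-process queue.

```python
import queue
import threading

q = queue.Queue()
processed = []

def consumer():
    """Drain the queue until a sentinel value arrives."""
    while True:
        msg = q.get()
        if msg is None:  # sentinel: stop consuming
            break
        processed.append(msg.upper())

t = threading.Thread(target=consumer)
t.start()
for msg in ["hello", "world"]:  # producer side
    q.put(msg)
q.put(None)
t.join()
```

The queue decouples the two sides: the producer can run ahead of the consumer without losing messages.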
A type of storage that retains data even when power is lost or hardware fails.
A repository for raw data that can be processed and analyzed later.
A structured repository of data for reporting and analytics.
Frameworks and tools designed to manage large volumes of data efficiently.
An open-source framework for distributed storage and processing of large datasets.
A fast and flexible processing engine for big data analytics and machine learning.
A data warehouse software that facilitates data analysis via SQL queries.
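Illustrative only: Hive runs SQL over distributed storage, but the standard library's `sqlite3` can stand in to show the kind of aggregation query involved. The table and column names are made up for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("north", 100.0), ("south", 50.0), ("north", 25.0)])

# A typical warehouse-style aggregation: totals per region.
rows = con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY REGION ORDER BY region"
).fetchall()
```

The value of SQL-on-big-data tools is that this same declarative query style scales to datasets far beyond a single machine.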
A company known for its data management and enterprise-grade technologies.
A tool that allows you to define and run multi-container Docker applications.
A distributed streaming platform that handles the ingestion of large volumes of data.
A visualization tool for Elasticsearch that helps users analyze and visualize data.
A search engine developed by Elastic which can index, store, and retrieve vast amounts of data.
A field of artificial intelligence that focuses on the interaction between humans and computers using natural language.
Algorithms trained to recognize patterns and make predictions based on historical data.
A subclass of neural networks that uses multiple layers to extract features from data.
A type of neural network that has a memory component allowing it to process sequential data.
A type of RNN designed to handle long-term dependencies in sequences.
A type of neural network architecture that uses self-attention mechanisms to process sequence data efficiently.
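A minimal sketch of scaled dot-product self-attention on toy vectors: each position attends to every position, weighting the values by the softmax of query-key similarity. The numbers are illustrative, and real implementations use learned projections and tensor libraries.

```python
import math

def softmax(xs):
    """Normalize scores into weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention over lists of vectors."""
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

x = [[1.0, 0.0], [0.0, 1.0]]  # two positions, dimension 2
y = attention(x, x, x)        # self-attention: Q = K = V = x
```

Because every position looks at every other position in one step, the model handles long-range dependencies without the sequential recurrence of an RNN.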
The process of teaching a machine to perform specific tasks through data.