/ˈstriːmɪŋ/

noun — "continuous delivery of data as it is consumed."

Streaming is a method of data transmission in which information is delivered and processed incrementally, allowing consumption to begin before the complete dataset has been transferred. Rather than waiting for a full file or payload to arrive, a receiving system handles incoming data in sequence as it becomes available. This model reduces startup latency and supports continuous use while transmission is still in progress.
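
As a minimal illustration in Python, the sketch below simulates this behavior with a hypothetical chunked_source generator standing in for a network transfer: the consumer begins processing the first segment before later segments exist locally.

```python
def chunked_source(data, chunk_size=4):
    """Yield the payload in small ordered segments, simulating arrival over a network."""
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

# The consumer starts working on the first segment immediately,
# without waiting for the full payload to be available.
received = bytearray()
for segment in chunked_source(b"incremental delivery of data"):
    received.extend(segment)  # process each segment as it arrives
    print(len(received), "bytes consumed so far")
```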

From a systems perspective, streaming depends on dividing data into ordered segments that can be independently transported, buffered, and reassembled. A producer emits these segments sequentially, while a consumer processes them in the same order. A buffer, a region of temporary storage between the two, absorbs short-term variations in delivery rate and protects the consumer from brief interruptions; managing this storage is known as buffering. The goal is not zero delay, but predictable continuity.
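
A common way to realize this pattern is a producer and a consumer connected by a bounded queue acting as the buffer. The sketch below is illustrative only; the segment count, timings, and SENTINEL marker are assumptions rather than part of any particular protocol.

```python
import queue
import random
import threading
import time

# A bounded queue acts as the buffer: it absorbs short-term variation
# in the producer's delivery rate without stalling the consumer.
buffer = queue.Queue(maxsize=8)
SENTINEL = None  # marks the end of the stream

def producer(n_segments=20):
    for seq in range(n_segments):
        time.sleep(random.uniform(0.0, 0.05))  # irregular delivery rate
        buffer.put((seq, f"segment-{seq}"))    # ordered segments
    buffer.put(SENTINEL)

def consumer():
    while True:
        item = buffer.get()
        if item is SENTINEL:
            break
        seq, payload = item
        print(f"consumed {payload} (sequence {seq})")

threading.Thread(target=producer, daemon=True).start()
consumer()
```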

Most modern streaming systems operate over standard network protocols layered on HTTP. Data is made available as a sequence of retrievable chunks, and clients request these chunks progressively. Clients measure network conditions such as throughput and latency and adapt their request strategy accordingly. This adaptive behavior allows systems to remain usable across fluctuating network environments.
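
The sketch below shows progressive retrieval over HTTP using the third-party requests library, assuming it is installed; the URL is hypothetical, and the throughput measurement only indicates where an adaptive client would attach its decision logic.

```python
import time

import requests  # third-party HTTP client; any streaming-capable client works

# Hypothetical URL; replace with any resource served over HTTP.
url = "https://example.com/large-resource"

# stream=True defers the body download so chunks can be read progressively.
with requests.get(url, stream=True, timeout=10) as response:
    response.raise_for_status()
    start = time.monotonic()
    received = 0
    for chunk in response.iter_content(chunk_size=64 * 1024):
        received += len(chunk)
        elapsed = time.monotonic() - start
        throughput = received / elapsed if elapsed > 0 else 0.0
        # A real client would feed this measurement back into its request
        # strategy, e.g. by choosing a different bitrate or segment size.
        print(f"{received} bytes received, ~{throughput / 1024:.1f} KiB/s")
```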

Encoding and compression are central to practical streaming. Data is transformed into compact representations that reduce transmission cost while preserving functional quality. In audiovisual systems, encoded streams are decoded incrementally, so playback can proceed without first reconstructing the complete stream. Hardware acceleration, commonly provided by a GPU, is often used to reduce decoding latency and computational load.
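
As a rough analogue of incremental decoding, the following sketch uses Python's zlib module to decompress a payload piece by piece; real audiovisual codecs are far more elaborate, but the incremental principle is the same.

```python
import zlib

payload = b"streaming data " * 1000
compressed = zlib.compress(payload)  # compact representation for transmission

# An incremental decoder reconstructs the data piece by piece,
# so downstream processing never needs the whole compressed payload at once.
decoder = zlib.decompressobj()
restored = bytearray()
chunk_size = 256
for i in range(0, len(compressed), chunk_size):
    restored.extend(decoder.decompress(compressed[i:i + chunk_size]))
restored.extend(decoder.flush())

assert bytes(restored) == payload
print(f"{len(compressed)} compressed bytes expanded to {len(restored)} bytes incrementally")
```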

Streaming extends beyond media delivery. In distributed computing, streams are used to represent ongoing sequences of events, measurements, or state changes. Consumers read from these streams in order and update internal state as new elements arrive. This approach supports real-time analytics, monitoring, and control systems where delayed batch processing would be ineffective.
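
One sketch of this pattern: a consumer folds each arriving event into a small piece of running state (here, a running mean) instead of accumulating a batch. The sensor_stream generator and Reading record are stand-ins for a real event source.

```python
import random
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

def sensor_stream(n=10):
    """Stand-in for an ongoing event source such as a message queue or log."""
    for _ in range(n):
        yield Reading("sensor-1", random.uniform(18.0, 24.0))

# The consumer keeps only a small running state (count and mean),
# updating it as each event arrives rather than waiting for a batch.
count, mean = 0, 0.0
for event in sensor_stream():
    count += 1
    mean += (event.value - mean) / count  # incremental mean update
    print(f"after {count} readings, running mean = {mean:.2f}")
```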

Architecturally, streaming systems emphasize sustained throughput, ordering guarantees, and fault tolerance. Producers and consumers are frequently decoupled by intermediaries that manage sequencing, buffering, and retransmission. This separation allows independent scaling and recovery from transient failures without halting the overall flow of data.
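
The sketch below models such an intermediary as a tiny in-memory append-only log with per-consumer offsets; it is only an illustration of the decoupling idea, and omits the persistence, partitioning, and retransmission that production systems provide.

```python
class StreamLog:
    """A minimal in-memory intermediary: an append-only, ordered log.

    Producers append; each consumer tracks its own offset, so consumers
    can fall behind, restart, and catch up without blocking the producer.
    """

    def __init__(self):
        self._entries = []

    def append(self, record):
        self._entries.append(record)
        return len(self._entries) - 1  # sequence number of the new record

    def read_from(self, offset):
        return self._entries[offset:]

log = StreamLog()
for i in range(5):
    log.append(f"event-{i}")

# A consumer that failed after offset 2 resumes from its last committed offset.
resume_offset = 2
for record in log.read_from(resume_offset):
    print("recovered consumer processed", record)
```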

A typical streaming workflow involves a source generating data continuously, such as video frames, sensor readings, or log entries. The data is segmented and transmitted as it is produced. The receiver buffers and processes each segment in order, discarding it after use. At no point is the entire dataset required to be present locally.
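
A compact sketch of this workflow, with a generator standing in for the continuous source: each segment is folded into a rolling digest and then discarded, so the full dataset is never held locally. The log_source name and segment count are illustrative assumptions.

```python
import hashlib
import itertools

def log_source():
    """Stand-in for a continuous source such as frames, readings, or log entries."""
    for n in itertools.count():
        yield f"log entry {n}\n".encode()

digest = hashlib.sha256()
processed = 0
for segment in itertools.islice(log_source(), 1000):
    digest.update(segment)  # process the segment...
    processed += 1          # ...then let it go out of scope; nothing is retained
print(f"processed {processed} segments, rolling digest = {digest.hexdigest()[:16]}...")
```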

In user-facing applications, streaming improves responsiveness by reducing perceived wait time. Playback can begin almost immediately, live feeds can be observed as they are generated, and ongoing data can be inspected in near real time. The defining advantage is incremental availability rather than completeness.

Within computing as a whole, streaming reflects a shift from static, file-oriented data handling toward flow-oriented design. Data is treated as something that moves continuously through systems, aligning naturally with distributed architectures, real-time workloads, and modern networked environments.

See Buffering, HTTP, Video Codec.