Consistency
/kənˈsɪstənsi/
noun … “All nodes see the same data at the same time.”
Availability
/əˌveɪləˈbɪləti/
noun … “System responds to requests, even under failure.”
Partition Tolerance
/ˈpɑːrtɪʃən ˈtɒlərəns/
noun … “System keeps working despite network splits.”
CAP Theorem
/kæp ˈθiərəm/
noun … “You can only fully guarantee two out of three.”
Information Theory
/ˌɪnfərˈmeɪʃən ˈθiəri/
noun … “Mathematics of encoding, transmitting, and measuring information.”
Shannon Limit
/ˈʃænən ˈlɪmɪt/
noun … “Maximum reliable information rate of a channel.”
Shannon Limit, named after Claude Shannon, is the theoretical maximum rate at which information can be transmitted over a communication channel of a given bandwidth and noise level with an arbitrarily low error rate. In information theory it is the upper bound on channel capacity (C) for a given bandwidth (B) and signal-to-noise ratio (SNR), given by the Shannon-Hartley theorem: C = B * log2(1 + SNR).
Key characteristics of the Shannon Limit include: it is a hard theoretical upper bound that no coding scheme can reliably exceed; it depends only on the channel's bandwidth and signal-to-noise ratio; and practical codes such as LDPC and turbo codes can approach it closely but never surpass it.
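As a sketch, the Shannon-Hartley formula above can be evaluated directly; the 3 kHz / 30 dB figures below are an illustrative textbook-style example, not values from this glossary:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Channel capacity in bits/s via the Shannon-Hartley theorem:
    C = B * log2(1 + SNR), with SNR as a linear (not dB) ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz telephone channel with 30 dB SNR (linear SNR = 1000).
capacity = shannon_capacity(3000, 1000)
print(f"{capacity:.0f} bits/s")  # roughly 29,902 bits/s
```

Note the conversion: an SNR quoted in decibels must be turned into a linear ratio (SNR_linear = 10^(dB/10)) before applying the formula.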
Immutability
/ˌɪˌmjuːtəˈbɪləti/
noun … “Data that never changes after creation.”
Immutability is the property of data structures or objects whose state cannot be modified once they are created. In programming, using immutable structures ensures that any operation producing a change returns a new instance rather than altering the original. This paradigm is central to Functional Programming, concurrent systems, and applications where predictable state is critical.
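The "operations return a new instance" pattern described above can be sketched with a frozen dataclass; `Point` is a hypothetical example type, not something from this glossary:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Point:
    """frozen=True blocks attribute assignment after creation."""
    x: int
    y: int

p1 = Point(1, 2)
p2 = replace(p1, x=10)  # "modification" yields a new instance

# The original is untouched: p1 is still Point(x=1, y=2),
# while p2 is Point(x=10, y=2). Assigning p1.x = 5 would
# raise dataclasses.FrozenInstanceError.
```

Because instances can never change, they are safe to share freely across threads and to use as dictionary keys, which is why the pattern is common in functional and concurrent code.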