<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc2629 version 1.6.2 (Ruby 3.0.3) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-kpugin-rush-01" category="info" tocInclude="true" sortRefs="true" symRefs="true" version="3">
  <!-- xml2rfc v2v3 conversion 3.12.3 -->
  <front>
    <title abbrev="rush">RUSH - Reliable (unreliable) streaming protocol</title>
    <seriesInfo name="Internet-Draft" value="draft-kpugin-rush-01"/>
    <author initials="K." surname="Pugin" fullname="Kirill Pugin">
      <organization>Facebook</organization>
      <address>
        <email>ikir@fb.com</email>
      </address>
    </author>
    <author initials="A." surname="Frindell" fullname="Alan Frindell">
      <organization>Facebook</organization>
      <address>
        <email>afrind@fb.com</email>
      </address>
    </author>
    <author initials="J." surname="Cenzano" fullname="Jordi Cenzano">
      <organization>Facebook</organization>
      <address>
        <email>jcenzano@fb.com</email>
      </address>
    </author>
    <author initials="J." surname="Weissman" fullname="Jake Weissman">
      <organization>Facebook</organization>
      <address>
        <email>jakeweissman@fb.com</email>
      </address>
    </author>
    <date year="2022" month="March" day="07"/>
    <area>General</area>
    <workgroup>TODO Working Group</workgroup>
    <keyword>Internet-Draft</keyword>
    <abstract>
      <t>RUSH is an application-level protocol for ingesting live video.
This document describes the protocol and how it maps onto QUIC.</t>
    </abstract>
    <note removeInRFC="true">
      <name>Discussion Venues</name>
      <t>Discussion of this document takes place on the
    mailing list (),
  which is archived at <eref target=""/>.</t>
      <t>Source for this draft and an issue tracker can be found at
  <eref target="https://github.com/afrind/draft-rush"/>.</t>
    </note>
  </front>
  <middle>
    <section anchor="introduction">
      <name>Introduction</name>
<t>RUSH is a bidirectional application-level protocol designed for live video
ingestion that runs on top of QUIC.</t>
<t>RUSH was built as a replacement for RTMP (Real-Time Messaging Protocol),
with the goals of supporting new audio and video codecs, providing
extensibility in the form of new message types, and supporting multiple
tracks. In addition, RUSH gives applications the option to control data
delivery guarantees by utilizing QUIC streams.</t>
      <t>This document describes the RUSH protocol, wire format, and QUIC mapping.</t>
    </section>
    <section anchor="conventions-and-definitions">
      <name>Conventions and Definitions</name>
      <t>The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD",
"SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this
document are to be interpreted as described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/>
when, and only when, they appear in all capitals, as shown here.</t>
      <dl>
        <dt>Frame/Message:</dt>
        <dd>
<t>logical unit of information that the client and server can exchange</t>
        </dd>
        <dt>PTS:</dt>
        <dd>
          <t>presentation timestamp</t>
        </dd>
        <dt>DTS:</dt>
        <dd>
          <t>decoding timestamp</t>
        </dd>
        <dt>AAC:</dt>
        <dd>
<t>advanced audio coding</t>
        </dd>
        <dt>NALU:</dt>
        <dd>
<t>network abstraction layer unit</t>
        </dd>
        <dt>VPS:</dt>
        <dd>
          <t>video parameter set (H265 video specific NALU)</t>
        </dd>
        <dt>SPS:</dt>
        <dd>
          <t>sequence parameter set (H264/H265 video specific NALU)</t>
        </dd>
        <dt>PPS:</dt>
        <dd>
          <t>picture parameter set (H264/H265 video specific NALU)</t>
        </dd>
        <dt>ADTS header:</dt>
        <dd>
          <t><em>Audio Data Transport Stream Header</em></t>
        </dd>
        <dt>ASC:</dt>
        <dd>
          <t>Audio specific config</t>
        </dd>
        <dt>GOP:</dt>
        <dd>
          <t>Group of pictures, specifies the order in which intra- and inter-frames are
arranged.</t>
        </dd>
      </dl>
    </section>
    <section anchor="theory-of-operations">
      <name>Theory of Operations</name>
      <section anchor="connection-establishment">
        <name>Connection establishment</name>
        <t>In order to live stream using RUSH, the client establishes a QUIC connection
using the ALPN token "rush".</t>
<t>After the QUIC connection is established, the client creates a new
bidirectional QUIC stream, chooses a starting frame ID, and sends a
<tt>Connect</tt> frame (<xref target="connect-frame"/>) over that stream.
This stream is called the Connect Stream.</t>
        <t>The client sends the <tt>mode of operation</tt> setting in the
<tt>Connect</tt> frame payload; the format of the payload is <tt>TBD</tt>.</t>
        <t>One connection SHOULD only be used to send one video.</t>
      </section>
      <section anchor="sending-video-data">
        <name>Sending Video Data</name>
        <t>The client can choose to wait for the <tt>ConnectAck</tt> frame <xref target="connect-ack-frame"/>
or it can start sending data immediately after sending the <tt>Connect</tt> frame.</t>
<t>A track is a logical organization of the data; for example, a video can have
one video track and two audio tracks (for two languages). The client can send
data for multiple tracks simultaneously.</t>
        <t>The encoded audio or video data of each track is serialized into frames (see
<xref target="audio-frame"/> or <xref target="video-frame"/>) and transmitted from the client to the
server.  Each track has its own monotonically increasing frame ID sequence. The
client MUST start with initial frame ID = 1.</t>
<t>Depending on the mode of operation (<xref target="quic-mapping"/>), the
client sends audio and video frames on the Connect Stream or on a new QUIC
stream for each frame.</t>
        <t>In <tt>Multi Stream Mode</tt> (<xref target="multi-stream-mode"/>), the client can stop sending a
frame by resetting the corresponding QUIC stream. In this case, there is no
guarantee that the frame was received by the server.</t>
      </section>
      <section anchor="receiving-data">
        <name>Receiving data</name>
<t>Upon receiving a <tt>Connect</tt> frame, the server replies with a
<tt>ConnectAck</tt> frame (<xref target="connect-ack-frame"/>) and prepares to
receive audio/video data.</t>
        <t>It's possible that in <tt>Multi Stream Mode</tt> (<xref target="multi-stream-mode"/>), the server
receives audio or video data before it receives the <tt>Connect</tt> frame.  The
implementation can choose whether to buffer or drop the data.  The audio/video
data cannot be interpreted correctly before the arrival of the <tt>Connect</tt> frame.</t>
<t>In <tt>Normal Mode</tt> (<xref target="normal-mode"/>), the transport
guarantees that frames arrive at the application layer in the order they were
sent.</t>
<t>In <tt>Multi Stream Mode</tt>, frames may arrive at the application layer in
a different order than they were sent; therefore, the server MUST keep track of
the last received frame ID for every track that it receives. A gap in the frame
sequence ID on a given track can indicate out-of-order delivery, and the server
MAY wait until the missing frames arrive. The server must consider a frame lost
if the corresponding QUIC stream was reset.</t>
<t>Upon detecting a gap in the frame sequence, the server MAY wait for the
missing frames to arrive for an implementation-defined time. If the missing
frames don't arrive, the server SHOULD consider them lost and continue
processing the rest of the frames. For example, if the server receives the
following frames for track 1: <tt>1 2 3 5 6</tt> and frame <tt>#4</tt> hasn't
arrived after the implementation-defined timeout, the server SHOULD continue
processing frames <tt>5</tt> and <tt>6</tt>.</t>
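        <t>The gap handling above can be sketched as a per-track reorder
buffer (a non-normative illustration; the class and method names are the
author's own, and the timeout that triggers <tt>declare_lost</tt> is
implementation-defined):</t>
        <sourcecode type="python"><![CDATA[
class TrackReassembler:
    """Buffers out-of-order frames for one track and releases them in
    frame ID order; declare_lost() skips a frame whose QUIC stream was
    reset or whose wait timed out."""

    def __init__(self):
        self.next_id = 1   # frame IDs start at 1 per track
        self.pending = {}  # frame_id -> payload

    def _drain(self):
        ready = []
        while self.next_id in self.pending:
            ready.append(self.pending.pop(self.next_id))
            self.next_id += 1
        return ready

    def on_frame(self, frame_id, payload):
        self.pending[frame_id] = payload
        return self._drain()       # frames now deliverable in order

    def declare_lost(self, frame_id):
        if frame_id == self.next_id:
            self.next_id += 1      # give up on the missing frame
        return self._drain()
]]></sourcecode>
        <t>With frames 1 2 3 5 6 received, the buffer releases 1 through 3,
holds 5 and 6, and releases them once frame 4 is declared lost.</t>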
        <t>When the client is done streaming, it sends the <tt>End of Video</tt> frame
(<xref target="end-of-video-frame"/>) to indicate to the server that there won't be any more
data sent.</t>
      </section>
      <section anchor="reconnect">
        <name>Reconnect</name>
<t>If the QUIC connection is closed at any point, the client MAY reconnect by
simply repeating the <tt>Connection establishment</tt> process
(<xref target="connection-establishment"/>) and resume sending the same video
where it left off.  To support termination of the new connection by a different
server, the client SHOULD resume sending video frames starting with an I-frame,
to guarantee that the video track can be decoded.</t>
<t>A reconnect can be initiated by the server if it needs to "go away" for
maintenance. In this case, the server sends a <tt>GOAWAY</tt> frame
(<xref target="goaway-frame"/>) to advise the client to gracefully close the
connection.  This allows the client to finish sending some data and establish a
new connection to continue sending without interruption.</t>
      </section>
    </section>
    <section anchor="wire-format">
      <name>Wire Format</name>
      <section anchor="frame-header">
        <name>Frame Header</name>
        <t>The client and server exchange information using frames. There are different
types of frames and the payload of each frame depends on its type.</t>
        <t>Generic frame format:</t>
        <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
|Type(8)| Payload ...                                          |
+-------+------------------------------------------------------+
]]></artwork>
        <dl>
<dt>Length(64):</dt>
          <dd>
<t>Each frame starts with a length field: a 64-bit value giving the size of the
frame in bytes, including the predefined fields (so if Length is 100 bytes,
then the payload length is 100 - 8 - 8 - 1 = 83 bytes).</t>
          </dd>
          <dt>ID(64):</dt>
          <dd>
<t>64-bit frame sequence number; every new frame MUST have a sequence ID
greater than that of the previous frame within the same track.  The track ID is
specified in each frame; if a track ID is not specified, it is implicitly
0.</t>
          </dd>
          <dt>Type(8):</dt>
          <dd>
            <t>1 byte representing the type of the frame.</t>
          </dd>
        </dl>
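        <t>As an illustration, the header layout above can be parsed as in the
following non-normative sketch. It assumes network byte order (big-endian) for
the Length and ID fields, which this draft does not state explicitly:</t>
        <sourcecode type="python"><![CDATA[
import struct

def parse_frame_header(buf):
    # Length(64) and ID(64) big-endian, then Type(8); the rest is payload.
    if len(buf) < 17:
        raise ValueError("need at least 17 bytes for a frame header")
    length, frame_id = struct.unpack_from(">QQ", buf, 0)
    frame_type = buf[16]
    payload_len = length - 17  # Length includes the predefined fields
    return length, frame_id, frame_type, payload_len
]]></sourcecode>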
        <t>Predefined frame types:</t>
        <table>
          <thead>
            <tr>
              <th align="left">Frame Type</th>
              <th align="left">Frame</th>
            </tr>
          </thead>
          <tbody>
            <tr>
              <td align="left">0x0</td>
              <td align="left">connect frame</td>
            </tr>
            <tr>
              <td align="left">0x1</td>
              <td align="left">connect ack frame</td>
            </tr>
            <tr>
              <td align="left">0x2</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x3</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x4</td>
              <td align="left">end of video frame</td>
            </tr>
            <tr>
              <td align="left">0x5</td>
              <td align="left">error frame</td>
            </tr>
            <tr>
              <td align="left">0x6</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x7</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x8</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x9</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xA</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
<td align="left">0xB</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xC</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xD</td>
              <td align="left">video frame</td>
            </tr>
            <tr>
              <td align="left">0xE</td>
              <td align="left">audio frame</td>
            </tr>
            <tr>
<td align="left">0xF</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
<td align="left">0x10</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x11</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x12</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x13</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x14</td>
              <td align="left">GOAWAY frame</td>
            </tr>
          </tbody>
        </table>
      </section>
      <section anchor="frames">
        <name>Frames</name>
        <section anchor="connect-frame">
          <name>Connect frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+---------------+---------------+--------------+
| 0x0   |Version|Video Timescale|Audio Timescale|              |
+-------+-------+---------------+---------------+--------------+
|                    Live Session ID(64)                       |
+--------------------------------------------------------------+
| Payload ...                                                  |
+--------------------------------------------------------------+
]]></artwork>
          <dl>
            <dt>Version:</dt>
            <dd>
              <t>version of the protocol (initial version is 0x0).</t>
            </dd>
            <dt>Video Timescale:</dt>
            <dd>
<t>timescale for all video frame timestamps on this connection; the recommended
value is 30000</t>
            </dd>
            <dt>Audio Timescale:</dt>
            <dd>
<t>timescale for all audio sample timestamps on this connection; the
recommended value is the audio sample rate, for example 44100</t>
            </dd>
            <dt>Live Session ID:</dt>
            <dd>
<t>identifier of the broadcast; when reconnecting, the client MUST use the same
Live Session ID</t>
            </dd>
            <dt>Payload:</dt>
            <dd>
<t>OPTIONAL application- and version-specific data that can be used by the server</t>
            </dd>
          </dl>
<t>This frame is used by the client to initiate broadcasting. The client can
start sending other frames immediately after the Connect frame without waiting
for an acknowledgement from the server.</t>
<t>If the server doesn't support the VERSION sent by the client, the server
sends an Error frame with code <tt>UNSUPPORTED VERSION</tt>.</t>
<t>If the audio timescale or video timescale is 0, the server sends an Error
frame with error code <tt>INVALID FRAME FORMAT</tt> and closes the
connection.</t>
          <t>If the client receives a Connect frame from the server, the client sends an
Error frame with code <tt>TBD</tt>.</t>
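          <t>A Connect frame can be serialized as in the following
non-normative sketch. The field widths are inferred from the diagram above
(Version is 8 bits, each timescale 16 bits) and big-endian byte order is
assumed:</t>
          <sourcecode type="python"><![CDATA[
import struct

def build_connect_frame(session_id, video_timescale=30000,
                        audio_timescale=44100, version=0x0,
                        frame_id=1, payload=b""):
    # Type 0x0, Version(8), Video/Audio Timescale(16), Session ID(64).
    body = struct.pack(">BBHHQ", 0x0, version, video_timescale,
                       audio_timescale, session_id) + payload
    length = 8 + 8 + len(body)  # Length covers the whole frame
    return struct.pack(">QQ", length, frame_id) + body
]]></sourcecode>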
        </section>
        <section anchor="connect-ack-frame">
          <name>Connect Ack frame</name>
          <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                          17                                  |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x1   |
+-------+
]]></artwork>
<t>The server sends the "Connect Ack" frame in response to the "Connect" frame,
indicating that the server accepts the version and is ready to receive
data.</t>
<t>If the client doesn't receive a "Connect Ack" frame from the server within a
timeout, it closes the connection.  The timeout value is chosen by the
implementation.</t>
<t>Only one "Connect Ack" frame can be sent over the lifetime of the QUIC
connection.</t>
<t>If the server receives a Connect Ack frame from the client, the server sends
an Error frame with code <tt>TBD</tt>.</t>
        </section>
        <section anchor="end-of-video-frame">
          <name>End of Video frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       17                                     |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x4   |
+-------+
]]></artwork>
<t>The End of Video frame is sent by a client when it's done sending data and
is about to close the connection. The server SHOULD ignore all frames sent
after it.</t>
        </section>
        <section anchor="error-frame">
          <name>Error frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       29                                     |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x5   |
+-------+------------------------------------------------------+
|                   Sequence ID (64)                           |
+------------------------------+-------------------------------+
|      Error Code (32)         |
+------------------------------+
]]></artwork>
          <dl>
            <dt>Sequence ID:</dt>
            <dd>
<t>ID of the client-sent frame that the error is generated for; ID=0x0
indicates a connection-level error.</t>
            </dd>
            <dt>Error Code:</dt>
            <dd>
              <t>32 bit unsigned integer</t>
            </dd>
          </dl>
<t>An Error frame can be sent by either the client or the server to indicate
that an error occurred.</t>
          <t>Some errors are fatal and the connection will be closed after sending the Error
frame.</t>
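          <t>The fixed 29-byte size follows from the field widths
(8 + 8 + 1 + 8 + 4 = 29); a non-normative serializer sketch, assuming
big-endian byte order:</t>
          <sourcecode type="python"><![CDATA[
import struct

ERROR_FRAME_LEN = 8 + 8 + 1 + 8 + 4  # = 29, matching the diagram

def build_error_frame(frame_id, sequence_id, error_code):
    # Length(64), ID(64), type 0x5, Sequence ID(64), Error Code(32).
    return struct.pack(">QQBQI", ERROR_FRAME_LEN, frame_id, 0x5,
                       sequence_id, error_code)
]]></sourcecode>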
        </section>
        <section anchor="video-frame">
          <name>Video frame</name>
<artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+----------------------------------------------+
| 0xD   | Codec |
+-------+-------+----------------------------------------------+
|                        PTS (64)                              |
+--------------------------------------------------------------+
|                        DTS (64)                              |
+--------------------------------------------------------------+
|                        Track ID (64)                         |
+-------------------+------------------------------------------+
| I-Frame ID Offset | Video Data ...                           |
+-------------------+------------------------------------------+
]]></artwork>
          <dl>
            <dt>Codec:</dt>
            <dd>
<t>specifies the codec that was used to encode this frame.</t>
            </dd>
            <dt>PTS:</dt>
            <dd>
              <t>presentation timestamp in connection video timescale</t>
            </dd>
            <dt>DTS:</dt>
            <dd>
              <t>decoding timestamp in connection video timescale</t>
            </dd>
          </dl>
<t>Supported codec types:</t>
          <table>
            <thead>
              <tr>
                <th align="left">Type</th>
                <th align="left">Codec</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td align="left">0x1</td>
                <td align="left">H264</td>
              </tr>
              <tr>
                <td align="left">0x2</td>
                <td align="left">H265</td>
              </tr>
              <tr>
                <td align="left">0x3</td>
                <td align="left">VP8</td>
              </tr>
              <tr>
                <td align="left">0x4</td>
                <td align="left">VP9</td>
              </tr>
            </tbody>
          </table>
          <dl>
            <dt>Track ID:</dt>
            <dd>
              <t>ID of the track that this frame is on</t>
            </dd>
            <dt>I-Frame ID Offset:</dt>
            <dd>
<t>Distance from this frame's sequence ID to that of the I-frame required to
decode this frame. This can be used to decide whether a frame can be
dropped.</t>
            </dd>
            <dt>Video Data:</dt>
            <dd>
<t>variable-length field that carries the actual video frame data, which is
codec dependent</t>
            </dd>
          </dl>
<t>For the H264/H265 codecs, "Video Data" is one or more NALUs in AVCC format:</t>
          <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                    NALU Length (64)                          |
+--------------------------------------------------------------+
|                    NALU Data ...
+--------------------------------------------------------------+
]]></artwork>
<t>Every H264 video key-frame MUST start with SPS/PPS NALUs.
Every H265 video key-frame MUST start with VPS/SPS/PPS NALUs.</t>
<t>Binary concatenation of "Video Data" from consecutive video frames, without
data loss, MUST produce a valid H264/H265 bitstream.</t>
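          <t>The AVCC framing above can be walked as follows (a non-normative
sketch; each NALU carries a 64-bit length prefix per the diagram, and
big-endian byte order is assumed):</t>
          <sourcecode type="python"><![CDATA[
import struct

def iter_nalus(video_data):
    # Yield each NALU from a Video Data field of length-prefixed NALUs.
    off = 0
    while off < len(video_data):
        if off + 8 > len(video_data):
            raise ValueError("truncated NALU length prefix")
        (nalu_len,) = struct.unpack_from(">Q", video_data, off)
        off += 8
        if off + nalu_len > len(video_data):
            raise ValueError("NALU extends past Video Data")
        yield video_data[off:off + nalu_len]
        off += nalu_len
]]></sourcecode>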
        </section>
        <section anchor="audio-frame">
          <name>Audio frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+----------------------------------------------+
| 0xE   | Codec |
+-------+-------+----------------------------------------------+
|                      Timestamp (64)                          |
+-------+------------------------------------------------------+
|TrackID|
+-------+------------------------------------------------------+
| Audio Data ...
+--------------------------------------------------------------+
]]></artwork>
          <dl>
            <dt>Codec:</dt>
            <dd>
<t>specifies the codec that was used to encode this frame.</t>
            </dd>
          </dl>
<t>Supported codec types:</t>
          <table>
            <thead>
              <tr>
                <th align="left">Type</th>
                <th align="left">Codec</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td align="left">0x1</td>
                <td align="left">AAC</td>
              </tr>
              <tr>
                <td align="left">0x2</td>
                <td align="left">OPUS</td>
              </tr>
            </tbody>
          </table>
          <dl>
            <dt>Timestamp:</dt>
            <dd>
<t>timestamp of the first audio sample in Audio Data.</t>
            </dd>
            <dt>Track ID:</dt>
            <dd>
              <t>ID of the track that this frame is on</t>
            </dd>
            <dt>Audio Data:</dt>
            <dd>
<t>variable-length field that carries one or more codec-dependent audio
frames.</t>
            </dd>
          </dl>
<t>For the AAC codec, "Audio Data" is one or more AAC samples, each prefixed
with an ADTS header:</t>
          <artwork><![CDATA[
0          56        ...     N
+---+---+---+---+---+---+---+...
| ADTS(56)  |  AAC SAMPLE   |
+---+---+---+---+---+---+---+...
]]></artwork>
<t>Binary concatenation of all AAC samples in "Audio Data" from consecutive
audio frames, without data loss, MUST produce a valid AAC bitstream.</t>
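          <t>The ADTS framing above can be split using the 13-bit frame length
field of the ADTS header. This parsing detail comes from the ADTS format
itself, not from this draft; the sketch is non-normative:</t>
          <sourcecode type="python"><![CDATA[
def iter_adts_frames(audio_data):
    # Yield each ADTS frame (7-byte header plus AAC sample) in turn.
    off = 0
    while off < len(audio_data):
        hdr = audio_data[off:off + 7]
        if len(hdr) < 7 or hdr[0] != 0xFF or (hdr[1] & 0xF0) != 0xF0:
            raise ValueError("bad ADTS syncword")
        # aac_frame_length: 13 bits spanning bytes 3-5, includes header
        frame_len = ((hdr[3] & 0x03) << 11) | (hdr[4] << 3) | (hdr[5] >> 5)
        if frame_len < 7 or off + frame_len > len(audio_data):
            raise ValueError("ADTS frame extends past Audio Data")
        yield audio_data[off:off + frame_len]
        off += frame_len
]]></sourcecode>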
<t>For the OPUS codec, "Audio Data" is one or more OPUS samples, prefixed with
an OPUS header as defined in <xref target="RFC7845"/>.</t>
        </section>
        <section anchor="goaway-frame">
          <name>GOAWAY frame</name>
          <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                          17                                  |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x14  |
+-------+
]]></artwork>
          <t>The GOAWAY frame is used by the server to initiate graceful shutdown of a connection, for example, for server maintenance.</t>
<t>Upon receiving a GOAWAY, the client MUST send the frames remaining in the
current GOP and stop sending new frames on this connection. The client SHOULD
establish a new connection and resume sending frames there.</t>
<t>After sending a GOAWAY frame, the server continues processing arriving
frames for an implementation-defined time, after which the server SHOULD close
the connection.</t>
        </section>
      </section>
      <section anchor="quic-mapping">
        <name>QUIC Mapping</name>
<t>One of the main goals of the RUSH protocol is to give applications a way to
control the delivery reliability of audio/video data. This is achieved by using
a special mode (<xref target="multi-stream-mode"/>).</t>
        <section anchor="normal-mode">
          <name>Normal mode</name>
<t>In normal mode, RUSH uses one bidirectional QUIC stream to send and receive
data.  Using one stream guarantees reliable, in-order delivery; applications
can rely on the QUIC transport layer to retransmit lost packets.  The
performance characteristics of this mode are similar to RTMP over TCP.</t>
        </section>
        <section anchor="multi-stream-mode">
          <name>Multi Stream Mode</name>
<t>In normal mode, if a packet belonging to a video frame is lost, packets sent
after it are not delivered to the application until the lost packet is
retransmitted, even though they may have already arrived at the server. This
introduces head-of-line blocking and can negatively impact latency.</t>
          <t>To address this problem, RUSH defines "Multi Stream Mode", in which one QUIC
stream is used per audio/video frame.</t>
<t>Connection establishment follows the normal procedure, with the client
sending a Connect frame; after that, Video and Audio frames are sent using the
following rules:</t>
          <ul spacing="normal">
            <li>Each new frame is sent on new bidirectional QUIC stream</li>
<li>Frames within the same track must have IDs that increase monotonically by
one, such that ID(n) = ID(n-1) + 1</li>
          </ul>
<t>The receiver reconstructs the track using the frame IDs.</t>
<t>Response frames (Connect Ack and Error) are sent in the response direction
of the bidirectional stream that carried the frame they respond to.</t>
<t>The client MAY control delivery reliability by setting a delivery timer for
every audio or video frame and resetting the QUIC stream when the timer fires.
This effectively stops retransmissions if the frame wasn't fully delivered in
time.</t>
<t>The timeout is implementation-defined; however, future versions of this
draft will define a way to negotiate it.</t>
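          <t>The delivery timer described above can be sketched with a
hypothetical stream object exposing write()/drain()/reset(); no particular
QUIC library is assumed, and the sketch is non-normative:</t>
          <sourcecode type="python"><![CDATA[
import asyncio

async def send_frame_with_deadline(stream, frame_bytes, timeout_s):
    try:
        stream.write(frame_bytes)
        # drain() is assumed to resolve once the frame is fully delivered
        await asyncio.wait_for(stream.drain(), timeout=timeout_s)
    except asyncio.TimeoutError:
        stream.reset()  # stop retransmissions for this frame
]]></sourcecode>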
        </section>
      </section>
    </section>
    <section anchor="error-handling">
      <name>Error Handling</name>
      <t>An endpoint that detects an error SHOULD signal the existence of that error to
its peer.  Errors can affect an entire connection (see <xref target="connection-errors"/>),
or a single frame (see <xref target="frame-errors"/>).</t>
      <t>The most appropriate error code SHOULD be included in the error frame that
signals the error.</t>
      <section anchor="connection-errors">
        <name>Connection Errors</name>
<t>There is one error code defined in the core protocol that indicates a
connection error:</t>
        <t>1 - UNSUPPORTED VERSION - indicates that the server doesn't support
the version specified in the Connect frame</t>
      </section>
      <section anchor="frame-errors">
        <name>Frame errors</name>
<t>There are two error codes defined in the core protocol that indicate a
problem with a particular frame:</t>
        <t>2 - UNSUPPORTED CODEC - indicates that the server doesn't support the given
audio or video codec</t>
        <t>3 - INVALID FRAME FORMAT - indicates that the receiver was not able to parse
the frame or there was an issue with a field's value.</t>
      </section>
    </section>
    <section anchor="extensions">
      <name>Extensions</name>
      <t>RUSH permits extension of the protocol.</t>
      <t>Extensions are permitted to use new frame types (<xref target="wire-format"/>), new error
codes (<xref target="error-frame"/>), or new audio and video codecs (<xref target="audio-frame"/>,
<xref target="video-frame"/>).</t>
<t>Implementations MUST ignore unknown or unsupported values in all extensible
protocol elements, except the codec ID, for which an UNSUPPORTED CODEC error is
returned. Implementations MUST discard frames that have unknown or unsupported
types.</t>
    </section>
    <section anchor="security-considerations">
      <name>Security Considerations</name>
<t>The RUSH protocol relies on the security guarantees provided by the
transport.</t>
<t>Implementations SHOULD be prepared to handle cases where a sender
deliberately sends frames with gaps in sequence IDs.</t>
<t>Implementations SHOULD be prepared to handle cases where the server never
receives a Connect frame (<xref target="connect-frame"/>).</t>
<t>A frame parser MUST ensure that the value of the frame length field (see
<xref target="frame-header"/>) matches the actual length of the frame,
including the frame header.</t>
<t>Implementations SHOULD be prepared to handle cases where a sender sends a
frame with a very large frame length field value.</t>
    </section>
    <section anchor="iana-considerations">
      <name>IANA Considerations</name>
      <t>TODO: add frame type registry, error code registry, audio/video codecs
registry</t>
    </section>
  </middle>
  <back>
    <references>
      <name>Normative References</name>
      <reference anchor="RFC2119">
        <front>
          <title>Key words for use in RFCs to Indicate Requirement Levels</title>
          <author fullname="S. Bradner" initials="S." surname="Bradner">
            <organization/>
          </author>
          <date month="March" year="1997"/>
          <abstract>
            <t>In many standards track documents several words are used to signify the requirements in the specification.  These words are often capitalized. This document defines these words as they should be interpreted in IETF documents.  This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
          </abstract>
        </front>
        <seriesInfo name="BCP" value="14"/>
        <seriesInfo name="RFC" value="2119"/>
        <seriesInfo name="DOI" value="10.17487/RFC2119"/>
      </reference>
      <reference anchor="RFC8174">
        <front>
          <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
          <author fullname="B. Leiba" initials="B." surname="Leiba">
            <organization/>
          </author>
          <date month="May" year="2017"/>
          <abstract>
            <t>RFC 2119 specifies common key words that may be used in protocol  specifications.  This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the  defined special meanings.</t>
          </abstract>
        </front>
        <seriesInfo name="BCP" value="14"/>
        <seriesInfo name="RFC" value="8174"/>
        <seriesInfo name="DOI" value="10.17487/RFC8174"/>
      </reference>
      <reference anchor="RFC7845">
        <front>
          <title>Ogg Encapsulation for the Opus Audio Codec</title>
          <author fullname="T. Terriberry" initials="T." surname="Terriberry">
            <organization/>
          </author>
          <author fullname="R. Lee" initials="R." surname="Lee">
            <organization/>
          </author>
          <author fullname="R. Giles" initials="R." surname="Giles">
            <organization/>
          </author>
          <date month="April" year="2016"/>
          <abstract>
            <t>This document defines the Ogg encapsulation for the Opus interactive speech and audio codec.  This allows data encoded in the Opus format to be stored in an Ogg logical bitstream.</t>
          </abstract>
        </front>
        <seriesInfo name="RFC" value="7845"/>
        <seriesInfo name="DOI" value="10.17487/RFC7845"/>
      </reference>
    </references>
    <section numbered="false" anchor="acknowledgments">
      <name>Acknowledgments</name>
      <t>This draft is the work of many people: Vlad Shubin, Nitin Garg, Milen Lazarov,
Benny Luo, Nick Ruff, Konstantin Tsoy, Nick Wu.</t>
    </section>
  </back>

</rfc>
