Sending binary (non-ASCII) data in a request header causes a parsing error
I'm currently working on a project that requires sending an encrypted API key and its initialization vector (IV) from a client-side application to a server. However, I've run into an issue with the character restrictions on HTTP headers. The HTTP protocol only allows a limited set of ASCII characters in header values, so non-ASCII bytes, such as a sequence of raw binary data, can cause parsing errors.
To illustrate this, here's a sequence of binary data that I attempted to send:
\xFE4\xB2\xD9\b4\xD4T0\x99iNJ\xBEQ"&~\x9D\x9E\xDF\xB7\x8Dtr|\x990\x04\xFC\xC5\xD2FL\xD6\xE9\xAA\xD3
This sequence of binary data was produced by the OpenSSL::Cipher.new('aes-256-gcm') algorithm, and unfortunately it contains bytes that are not valid in an HTTP header value. To work around this, I decided to use Base64 encoding.
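To show how such binary data comes about, here is a minimal sketch of the encryption step. The plaintext value is made up for illustration; the key and IV are freshly generated:

```ruby
require 'openssl'

# Hypothetical plaintext; in the real app this would be the API key payload.
plaintext = 'my-secret-api-key'

cipher = OpenSSL::Cipher.new('aes-256-gcm')
cipher.encrypt
key = cipher.random_key   # 32-byte key for AES-256
iv  = cipher.random_iv    # 12-byte IV, the GCM default

ciphertext = cipher.update(plaintext) + cipher.final
auth_tag   = cipher.auth_tag  # GCM also produces a 16-byte authentication tag

# The ciphertext is a raw binary string and will usually contain
# bytes outside the printable ASCII range:
puts ciphertext.encoding  # prints ASCII-8BIT
```

Note that GCM produces an authentication tag in addition to the ciphertext; the server will need it (along with the key and IV) to decrypt.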
Base64 is a scheme that transforms binary data into text. Since the HTTP protocol expects ASCII characters, Base64 maps the binary data onto a set of 64 printable ASCII characters that can be transmitted or stored as text (at the cost of roughly a 33% increase in size).
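In Ruby this is a one-liner with the standard library. `strict_encode64` is the variant to use for headers, since it emits no newlines; the header name below is purely illustrative:

```ruby
require 'base64'

binary = "\xFE\x34\xB2\xD9".b  # a few raw bytes, like the ciphertext above

# strict_encode64 uses the standard Base64 alphabet and adds no newlines,
# so the result is safe to use as an HTTP header value.
encoded = Base64.strict_encode64(binary)
puts encoded  # prints "/jSy2Q=="

# The encoded string can then be placed in a request header, e.g.:
#   request['X-Encrypted-Key'] = encoded   # header name is illustrative
```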
Once the text data reaches the server side, it is Base64-decoded back into binary data. This binary data is then deciphered using the OpenSSL::Cipher.new('aes-256-gcm') algorithm, with the key, the IV and, because GCM is an authenticated mode, the authentication tag.
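A sketch of that server-side step might look like this, assuming the client sent the ciphertext, IV, and auth tag Base64-encoded (the method name and parameters are illustrative, not part of any framework):

```ruby
require 'openssl'
require 'base64'

# Decode the Base64 header values and decrypt with AES-256-GCM.
# `key` is the shared 32-byte secret already known to the server.
def decrypt_payload(encoded_ciphertext, key, encoded_iv, encoded_tag)
  decipher = OpenSSL::Cipher.new('aes-256-gcm')
  decipher.decrypt
  decipher.key      = key
  decipher.iv       = Base64.strict_decode64(encoded_iv)
  decipher.auth_tag = Base64.strict_decode64(encoded_tag)
  # `final` raises OpenSSL::Cipher::CipherError if the tag does not verify.
  decipher.update(Base64.strict_decode64(encoded_ciphertext)) + decipher.final
end
```

If the tag or ciphertext was tampered with, `final` raises an error instead of returning garbage, which is one of the benefits of GCM over plain CBC.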
Overall, to transfer confidential data over the HTTP protocol, it needs to be transformed twice. First, it's encrypted using the AES-256-GCM algorithm, and then the resulting binary ciphertext is encoded into text using the Base64 scheme.