Piping Responses
A SQLFlow job streams back several kinds of responses, including:

- standard error, where each element is a line of text,
- data rows, where the first element carries the field names and types, and each subsequent element is a row of data (see the sketch below).
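To make these element kinds concrete, here is a minimal Go sketch of the response elements. The package and type names (`pipe`, `Text`, `Head`, `Row`) are hypothetical, chosen for illustration rather than taken from SQLFlow's actual code:

```go
// Package pipe is a hypothetical package modeling the response
// elements a SQLFlow job can emit.
package pipe

// Text is one line of standard output or standard error.
type Text struct {
	IsStderr bool   // true for a stderr line
	Line     string // the text of the line
}

// Head carries the field names and types of a result set; it is the
// first element of a data-row stream.
type Head struct {
	FieldNames []string
	FieldTypes []string
}

// Row is one row of data following a Head.
type Row struct {
	Values []interface{}
}
```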
To create a good user experience, we need to pipe these responses from SQLFlow jobs to the Jupyter Notebook in real time.
In the figure above, everything from the SQLFlow magic command down to the bottom layer is our work.
The figure above also shows that there are multiple streams between the Jupyter Notebook server and the Jupyter kernels. According to the Jupyter documentation, there are five: Shell, IOPub, stdin, Control, and Heartbeat. These are ZeroMQ streams. We don't use ZeroMQ, but we can borrow the idea of having multiple parallel streams in the pipe.
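One way to borrow that idea without ZeroMQ is to keep one stream per response kind, for example one Go channel each. The sketch below reuses the hypothetical `Text`, `Head`, and `Row` types from above; the `Streams` type and its buffer sizes are assumptions made for illustration:

```go
// Streams keeps the response kinds in separate, parallel channels,
// mirroring how Jupyter keeps Shell, IOPub, stdin, Control, and
// Heartbeat as separate streams.
type Streams struct {
	Texts chan Text // lines of stdout/stderr
	Heads chan Head // field names/types, one per result set
	Rows  chan Row  // data rows following a Head
}

// NewStreams creates the parallel channels with small buffers so a
// slow consumer does not immediately block the SQLFlow job.
func NewStreams() *Streams {
	return &Streams{
		Texts: make(chan Text, 64),
		Heads: make(chan Head, 1),
		Rows:  make(chan Row, 64),
	}
}
```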
Another idea is to multiplex all streams into one. For example, we could have only one stream, where each element is of a polymorphic type: it could be a text string or a data row.
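A sketch of this multiplexed alternative, again using the hypothetical types above: the single stream carries a tagged-union element, in the spirit of a protocol-buffer `oneof`, where exactly one field is set. The `Response` and `Pipe` names are illustrative only:

```go
// Response is a tagged union of all element kinds; exactly one of the
// pointer fields is non-nil.
type Response struct {
	Text *Text // a line of stdout or stderr
	Head *Head // field names/types, sent once before the rows
	Row  *Row  // one row of data
}

// Pipe is the single multiplexed stream that carries all kinds of
// responses in order.
type Pipe chan Response

// write sends a data row down the multiplexed pipe.
func write(p Pipe, r Row) {
	p <- Response{Row: &r}
}
```

A consumer then checks which field is non-nil for each element, which preserves the ordering between text and rows at the cost of every reader having to handle all element kinds.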