In a data science or machine learning project, you may prepare and study images or other data within a Jupyter notebook, then need to annotate that data to augment the training set or to fix errors in your source data.
If you're working with Jupyter Notebook extensions (nbextensions), you may have used commands such as
jupyter nbextension enable <extension name> to make them work, or the matching disable command to stop them from loading temporarily.
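The typical workflow looks like this (codefolding/main is just an example extension name from the jupyter_contrib_nbextensions package; substitute your own):

```shell
# Enable an extension so it loads with every notebook
jupyter nbextension enable codefolding/main

# Temporarily stop it from loading, without uninstalling it
jupyter nbextension disable codefolding/main

# Show installed extensions and whether each is enabled
jupyter nbextension list
```
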
I was trying to get TensorFlow 1.8 to run under Google Cloud ML - potentially distributed, but more importantly just to use a faster processor than my laptop. One complication was that my input data needed to be stored on Google Cloud Storage. On a single machine it would probably be fine to load the file directly into memory from Cloud Storage, but especially for distributed settings, TensorFlow 1.8 expects you to use its new Dataset API to load the data. This gets quite complicated, especially if you need to do processing on the raw input CSV data.
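The parsing step is where most of that complication lives. As a rough illustration in plain Python (standing in for what the real pipeline would express inside dataset.map() with tf.data.TextLineDataset and tf.decode_csv in TensorFlow 1.8 - the column layout here is made up):

```python
import csv
import io

def parse_line(line, n_features=4):
    """Split one raw CSV line into (features, label).

    Hypothetical schema: n_features float columns followed by an
    integer label column. In a real TF 1.8 pipeline this logic would
    live in a function passed to dataset.map(), implemented with
    tf.decode_csv so it runs inside the graph and works the same way
    against a gs:// path in distributed settings.
    """
    row = next(csv.reader(io.StringIO(line)))
    features = [float(v) for v in row[:n_features]]
    label = int(row[n_features])
    return features, label

def make_dataset(lines):
    """Stand-in for tf.data.TextLineDataset(path).map(parse_fn)."""
    for line in lines:
        yield parse_line(line)
```

The key design point is that the parsing is expressed per-line, so the input framework can stream and shard the file rather than requiring it all in memory on one worker.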
As a way to learn TensorFlow and deep neural networks, I made a DNN model to play Noughts and Crosses. It's interesting from a technical point of view, but ultimately not the best way to solve the game: search would be a better algorithm, and if you do want to use DNNs, ideally you would combine them with Reinforcement Learning techniques. The game-play model is probably the most reusable part.
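To make the "search would be better" point concrete: Noughts and Crosses is small enough that plain minimax solves it exactly, with no training at all. A minimal sketch (not the code from the project described above):

```python
# Exact minimax player for Noughts and Crosses.
# Board: list of 9 cells, each 'X', 'O', or ' ' (row-major order).

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if someone has three in a line, else None."""
    for a, b, c in LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) for `player` to move.

    Score is from `player`'s point of view: +1 win, 0 draw, -1 loss.
    `move` is a board index, or None if the game is already over.
    """
    w = winner(board)
    if w is not None:
        return (1 if w == player else -1), None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None  # full board, no winner: draw
    opponent = 'O' if player == 'X' else 'X'
    best_score, best_move = -2, None
    for m in moves:
        board[m] = player
        score, _ = minimax(board, opponent)
        board[m] = ' '
        score = -score  # the opponent's gain is our loss
        if score > best_score:
            best_score, best_move = score, m
    return best_score, best_move
```

Searching the full game tree from an empty board takes a fraction of a second, which is why exhaustive search beats a learned model here; DNNs (with Reinforcement Learning) only start to pay off on games too large to search.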