Testing IPython for users and developers
Attention
This is copied verbatim from the old IPython wiki and is currently under development. Much of the information in this part of the development guide is out of date.
Overview
It is extremely important that all code contributed to IPython has
tests. Tests should be written as unittests, doctests or other entities
that the IPython test system can detect. See below for more details on
this.
Each subpackage in IPython should have its own tests directory that contains all of the tests for that subpackage. All of the files in the tests directory should have the word “tests” in them to enable the testing framework to find them.
In docstrings, examples (either using IPython prompts like In [1]: or ‘classic’ Python >>> ones) can and should be included. The testing system will detect them as doctests and will run them; it offers control to skip parts or all of a specific doctest if the example is meant to be informative but shows non-reproducible information (like filesystem data).
If a subpackage has any dependencies beyond the Python standard library,
the tests for that subpackage should be skipped if the dependencies are
not found. This is very important so users don’t get tests failing
simply because they don’t have dependencies.
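The skip-on-missing-dependency guard described above can be sketched with standard-library tools; the helper name below is illustrative, not IPython's actual mechanism (which lives in IPython.testing):

```python
import importlib.util
import unittest

def optional_dependency(modname):
    """Skip the decorated test unless `modname` is importable.

    Illustrative helper only; IPython's real machinery is in
    IPython.testing (decorators and the iptest exclusion list).
    """
    available = importlib.util.find_spec(modname) is not None
    return unittest.skipUnless(available, "%s not available" % modname)

class TestWithOptionalDep(unittest.TestCase):

    @optional_dependency("sqlite3")            # stdlib, so this test runs
    def test_sqlite_present(self):
        import sqlite3
        self.assertTrue(hasattr(sqlite3, "connect"))

    @optional_dependency("not_a_real_module")  # missing, so this is skipped
    def test_missing_dep(self):
        self.fail("never reached: the decorator skips this test")

# Run the two tests: one passes, one is reported as a skip, none fail.
result = unittest.TestResult()
unittest.TestLoader().loadTestsFromTestCase(TestWithOptionalDep).run(result)
print(result.testsRun, len(result.skipped), result.wasSuccessful())
```

The point is that a missing optional dependency surfaces as a skip in the report, never as a failure.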
The testing system we use is an extension of the nose test runner. In particular, we’ve developed a nose plugin that allows us to paste verbatim IPython sessions and test them as doctests, which is extremely important for us.
Running the test suite
You can run IPython from the source download directory, without even installing it system-wide or having to configure anything, by typing at the terminal:
python2 -c "import IPython; IPython.start_ipython();"
To start the web-based notebook you can use:
python2 -c "import IPython; IPython.start_ipython(['notebook']);"
In order to run the test suite, you must at least be able to import
IPython, even if you haven’t fully installed the user-facing scripts yet
(common in a development environment). You can then run the tests with:
python -c "import IPython; IPython.test()"
Once you have installed IPython, either via a full install or in development mode, you will have available a system-wide script called iptest that runs the full test suite. You can then run the suite simply by typing iptest at the terminal. By default, this excludes the relatively slow tests for IPython.parallel. To run these, use iptest --all.
Please note that the iptest tool will run tests against the code imported by the Python interpreter. If the command python setup.py symlink has been previously run, then this will always be the test code in the local directory via a symlink. However, if this command has not been run for the version of Python being tested, there is the possibility that iptest will run the tests against an installed version of IPython.
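A quick way to check which copy of the code will actually be tested is to ask the interpreter where it resolves the package. The tiny helper below is our own; it is shown with a stdlib package so the snippet runs anywhere, but with IPython installed you would pass "IPython":

```python
import importlib

def locate(package_name):
    """Return the filesystem path that a package resolves to."""
    return importlib.import_module(package_name).__file__

# Call locate("IPython") to see whether iptest will exercise your
# working tree or an installed copy; demonstrated here with "email":
print(locate("email"))
```

If the printed path is not your development tree, iptest is testing an installed version.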
Regardless of how you run things, you should eventually see something
like:
**********************************************************************
Test suite completed for system with the following information:
{'commit_hash': '144fdae',
'commit_source': 'repository',
'ipython_path': '/home/fperez/usr/lib/python2.6/site-packages/IPython',
'ipython_version': '0.11.dev',
'os_name': 'posix',
'platform': 'Linux-2.6.35-22-generic-i686-with-Ubuntu-10.10-maverick',
'sys_executable': '/usr/bin/python',
'sys_platform': 'linux2',
'sys_version': '2.6.6 (r266:84292, Sep 15 2010, 15:52:39) \n[GCC 4.4.5]'}
Tools and libraries available at test time:
curses matplotlib pymongo qt sqlite3 tornado wx wx.aui zmq
Ran 9 test groups in 67.213s
Status:
OK
If not, there will be a message indicating which test group failed and how to rerun that group individually. For example, this tests the IPython.utils subpackage; the -v option shows progress indicators:
$ iptest IPython.utils -- -v
..........................SS..SSS............................S.S...
.........................................................
----------------------------------------------------------------------
Ran 125 tests in 0.119s
OK (SKIP=7)
Because the IPython test machinery is based on nose, you can use all nose syntax. Options after -- are passed to nose. For example, this lets you run the specific test test_rehashx inside the test_magic module:
$ iptest IPython.core.tests.test_magic:test_rehashx -- -vv
IPython.core.tests.test_magic.test_rehashx(True,) ... ok
IPython.core.tests.test_magic.test_rehashx(True,) ... ok
----------------------------------------------------------------------
Ran 2 tests in 0.100s
OK
When developing, the --pdb and --pdb-failures options of nose are particularly useful; these drop you into an interactive pdb session at the point of the error or failure respectively: iptest mymodule -- --pdb.
The system information summary printed above is accessible from the top
level package. If you encounter a problem with IPython, it’s useful to
include this information when reporting on the mailing list; use:
from IPython import sys_info
print sys_info()
and include the resulting information in your query.
Testing pull requests
We have a script that fetches a pull request from Github, merges it with
master, and runs the test suite on different versions of Python. This
uses a separate copy of the repository, so you can keep working on the
code while it runs. To run it:
python tools/test_pr.py -p 1234
The number is the pull request number from Github; the -p flag makes it post the results to a comment on the pull request. Any further arguments are passed to iptest. This requires the requests and keyring packages.
For developers: writing tests
By now IPython has a reasonable test suite, so the best way to see what’s available is to look at the tests directory in most subpackages. But here are a few pointers to make the process easier.
Main tools: IPython.testing
The IPython.testing package is where all of the machinery to test IPython (rather than the tests for its various parts) lives. In particular, the iptest module in there has all the smarts to control the test process. In there, the make_exclude function is used to build a blacklist of exclusions; these are modules that do not even get imported for tests. This is important so that things that would fail to even import because of missing dependencies don’t give errors to end users, as we stated above.
The decorators module contains many useful decorators, especially for marking individual tests that should be skipped under certain conditions (rather than blacklisting the package altogether because of a missing major dependency).
Our nose plugin for doctests
The plugin subpackage in testing contains a nose plugin called ipdoctest that teaches nose about IPython syntax, so you can write doctests with IPython prompts. You can also mark doctest output with # random for the output corresponding to a single input to be ignored (stronger than using ellipsis, and useful to keep it as an example). If you want the entire docstring to be executed but none of the output from any input to be checked, you can use the # all-random marker. The IPython.testing.plugin.dtexample module contains examples of how to use these; for reference, here is how to use # random:
def ranfunc():
    """A function with some random output.

    Normal examples are verified as usual:
    >>> 1+3
    4

    But if you put '# random' in the output, it is ignored:
    >>> 1+3
    junk goes here...  # random

    >>> 1+2
    again, anything goes #random
    if multiline, the random mark is only needed once.

    >>> 1+2
    You can also put the random marker at the end:
    # random

    >>> 1+2
    # random
    .. or at the beginning.

    More correct input is properly verified:
    >>> ranfunc()
    'ranfunc'
    """
    return 'ranfunc'
and an example of # all-random:
def random_all():
    """A function where we ignore the output of ALL examples.

    Examples:

    # all-random

    This mark tells the testing machinery that all subsequent examples
    should be treated as random (ignoring their output).  They are still
    executed, so if they raise an error, it will be detected as such, but
    their output is completely ignored.

    >>> 1+3
    junk goes here...

    >>> 1+3
    klasdfj;

    In [8]: print 'hello'
    world  # random

    In [9]: iprand()
    Out[9]: 'iprand'
    """
    return 'iprand'
When writing docstrings, you can use the @skip_doctest decorator to indicate that a docstring should not be treated as a doctest at all. The difference between # all-random and @skip_doctest is that the former executes the example but ignores output, while the latter doesn’t execute any code. @skip_doctest should be used for docstrings whose examples are purely informational.
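As a rough sketch of the mechanism (the real decorator lives in IPython.testing.decorators; this stand-in only illustrates the idea of tagging a function so a custom doctest finder can skip it):

```python
# Illustrative stand-in for @skip_doctest, not IPython's implementation.
def skip_doctest(func):
    func.skip_doctest = True   # marker a custom doctest finder could consult
    return func

@skip_doctest
def informational():
    """A purely informational example, never run as a doctest:

    >>> some_expensive_operation()   # hypothetical; would fail if executed
    """
    return "ok"

# The function itself still works normally; only doctest collection skips it.
print(informational())
```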
If a given docstring fails under certain conditions but otherwise is a good doctest, you can use code like the following, which relies on the ‘null’ decorator to leave the docstring intact where it works as a test:

# The docstring for full_path doctests differently on win32 (different path
# separator) so just skip the doctest there, and use a null decorator
# elsewhere:

doctest_deco = dec.skip_doctest if sys.platform == 'win32' else dec.null_deco

@doctest_deco
def full_path(startPath, files):
    """Make full paths for all the listed files, based on startPath..."""
    # function body follows...
With our nose plugin that understands IPython syntax, an extremely effective way to write tests is to simply copy and paste an interactive session into a docstring. You can write this type of test, where your docstring is meant only as a test, by prefixing the function name with doctest_ and leaving its body absolutely empty other than the docstring. In IPython.core.tests.test_magic you can find several examples of this, but for completeness’ sake, your code should look like this (a simple case):
def doctest_time():
    """
    In [10]: %time None
    CPU times: user 0.00 s, sys: 0.00 s, total: 0.00 s
    Wall time: 0.00 s
    """
This function is only analyzed for its docstring but it is not
considered a separate test, which is why its body should be empty.
JavaScript Tests
We currently use casperjs for testing the notebook javascript user interface. To run the JS test suite by itself, you can either use iptest js, which will start up a new notebook server and test against it, or you can open up a notebook server yourself, and then:
cd IPython/html/tests/casperjs;
casperjs test --includes=util.js test_cases
If your testing notebook server uses something other than the default
port (8888), you will have to pass that as a parameter to the test suite
as well.
casperjs test --includes=util.js --port=8889 test_cases
Running individual tests
To speed up development, you are usually working on getting one test passing at a time. To do this, just pass the filename directly to the casperjs test command like so:
casperjs test --includes=util.js test_cases/execute_code_cell.js
Wrapping your head around the javascript within javascript: CasperJS is a browser that’s written in javascript, so we write javascript code to drive it. The Casper browser itself also has a javascript implementation (like the ones that come with Firefox and Chrome), and in the test suite we get access to those using this.evaluate and its cousins (this.thenEvaluate, etc.). Additionally, because of the asynchronous / callback nature of everything, there are plenty of this.then calls which define steps in the test suite. Part of the reason for this is that each step has a timeout (default of 5 or 10 seconds). Additionally, there are already convenience functions in util.js to help you wait for output in a given cell, etc. In our javascript tests, if you see functions which look_like_pep8_naming_convention, those are probably coming from util.js, whereas functions that come with casper haveCamelCaseNamingConvention.
Each file in test_cases looks something like this (this is test_cases/check_interrupt.js):
casper.notebook_test(function () {
    this.evaluate(function () {
        var cell = IPython.notebook.get_cell(0);
        cell.set_text('import time\nfor x in range(3):\n time.sleep(1)');
        cell.execute();
    });

    // interrupt using menu item (Kernel -> Interrupt)
    this.thenClick('li#int_kernel');

    this.wait_for_output(0);

    this.then(function () {
        var result = this.get_output_cell(0);
        this.test.assertEquals(result.ename, 'KeyboardInterrupt', 'keyboard interrupt (mouseclick)');
    });

    // run cell 0 again, now interrupting using keyboard shortcut
    this.thenEvaluate(function () {
        cell.clear_output();
        cell.execute();
    });

    // interrupt using Ctrl-M I keyboard shortcut
    this.thenEvaluate(function () {
        IPython.utils.press_ghetto(IPython.utils.keycodes.I);
    });

    this.wait_for_output(0);

    this.then(function () {
        var result = this.get_output_cell(0);
        this.test.assertEquals(result.ename, 'KeyboardInterrupt', 'keyboard interrupt (shortcut)');
    });
});
For an example of how to pass parameters to the client-side javascript from the casper test suite, see the casper.wait_for_output implementation in IPython/html/tests/casperjs/util.js.
Testing system design notes
This section is a set of notes on the key points of the IPython testing needs; they were used when writing the system and should be kept for reference as it evolves.
Testing IPython in full requires modifications to the default behavior of nose and doctest, because the IPython prompt is not recognized to determine Python input, and because IPython admits user input that is not valid Python (things like %magic and !system commands).
We basically need to be able to test the following types of code:
In the first case, Nose will recognize the doctests as long as it is
called with the --with-doctest
flag. But the second case will likely
require modifications or the writing of a new doctest plugin for Nose
that is IPython-aware.
The first two cases are similar to the situation #2 above, except that
in this case the doctests must be extracted from input code blocks using
docutils instead of from the Python docstrings.
In the third case, we must have a convention for distinguishing code
blocks that are meant for execution from others that may be snippets of
shell code or other examples not meant to be run. One possibility is to
assume that all indented code blocks are meant for execution, but to
have a special docutils directive for input that should not be executed.
For those code blocks that we will execute, the convention used will
simply be that they get called and are considered successful if they run
to completion without raising errors. This is similar to what Nose does
for standalone test functions, and by putting asserts or other forms of
exception-raising statements it becomes possible to have literate
examples that double as lightweight tests.
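A minimal standalone example of such a literate test might look like this; the function and data are ours, purely for illustration of the convention that a block succeeds if it runs to completion:

```python
# A standalone "literate" test in the style described above: the block
# simply runs to completion, and plain asserts turn the example into a
# lightweight test.
def test_url_split():
    parts = "http://example.com/a?x=1&y=2".split("&")
    assert parts == ["http://example.com/a?x=1", "y=2"]

test_url_split()   # raises AssertionError on failure, otherwise silent
```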
Extension modules with doctests in function and method
docstrings. Currently Nose simply can’t find these docstrings
correctly, because the underlying doctest DocTestFinder object
fails there. Similarly to #2 above, the docstrings could have
either pure python or IPython prompts.
Of these, only 3-c (reST with standalone code blocks) is not implemented
at this point.
Steps for Releasing IPython
This document contains notes about the process that is used to release
IPython. Our release process is currently not very formal and could be
improved.
Most of the release process is automated by the release script in the tools directory of our main repository. This document is just a handy reminder for the release manager.
0. Environment variables
You can set some env variables to note previous release tag and current
release milestone, version, and git tag:
PREV_RELEASE=rel-1.0.0
MILESTONE=1.1
VERSION=1.1.0
TAG="rel-$VERSION"
BRANCH=master
These will be used later if you want to copy/paste, or you can just type the appropriate command when the time comes. These variables are not used by scripts (hence no export).
1. Finish release notes
If a major release:
merge any pull request notes into what’s new:
python tools/update_whatsnew.py
update docs/source/whatsnew/development.rst, to ensure it covers the major points.
move the contents of development.rst to versionX.rst
generate a summary of GitHub contributions, which can be done with:
python tools/github_stats.py --milestone $MILESTONE > stats.rst
which may need some manual cleanup. Add the cleaned-up result to docs/source/whatsnew/github-stats-X.rst (make a new file, or add it to the top, depending on whether it is a major release). You can use:
git log --format="%aN <%aE>" $PREV_RELEASE... | sort -u -f
to find duplicates and update .mailmap. Before generating the GitHub stats, verify that all closed issues and pull requests have appropriate milestones. This search should return no results.
3. Create and push the new tag
Edit IPython/core/release.py to have the current version.
Commit the changes to release.py and jsversion:
git commit -am "release $VERSION"
git push origin $BRANCH
Create and push the tag:
git tag -am "release $VERSION" "$TAG"
git push origin --tags
Update release.py back to x.y-dev or x.y-maint, and push:
git commit -am "back to development"
git push origin $BRANCH
4. Get a fresh clone of the tag for building the release:
cd /tmp
git clone --depth 1 https://github.com/ipython/ipython.git -b "$TAG"
5. Run the release script
This makes the tarballs, zipfiles, and wheels. It posts them to archive.ipython.org and registers the release with PyPI. This will require that you have current versions of wheel, Python 3.4, and Python 2.7.
7. Update the IPython website
release announcement (news, announcements)
update current version and download links
(If major release) update links on the documentation page
8. Drafting a short release announcement
This should include i) highlights and ii) a link to the html version of
the What’s new section of the documentation.
Post to mailing list, and link from Twitter.
9. Update milestones on GitHub
close the milestone you just released
open new milestone for (x, y+1), if it doesn’t exist already
10. Celebrate!
IPython Sphinx Directive
The ipython directive is a stateful ipython shell for embedding in sphinx documents. It knows about standard ipython prompts, and extracts the input and output lines. These prompts will be renumbered starting at 1. The inputs will be fed to an embedded ipython interpreter and the outputs from that interpreter will be inserted as well. For example, code blocks like the following:
.. code:: python3
In [136]: x = 2
In [137]: x**3
Out[137]: 8
will be rendered as
In [136]: x = 2
In [137]: x**3
Out[137]: 8
Note
This tutorial should be read side-by-side with the Sphinx source
for this document because otherwise you will see only the rendered
output and not the code that generated it. Excepting the example
above, we will not in general be showing the literal ReST in this
document that generates the rendered output.
The state from previous sessions is stored, and standard error is
trapped. At doc build time, ipython’s output and std err will be
inserted, and prompts will be renumbered. So the prompt below should
be renumbered in the rendered docs, and pick up where the block above
left off.
In [138]: z = x*3 # x is recalled from previous block
In [139]: z
Out[139]: 6
In [140]: print z
--------> print(z)
6
In [141]: q = z[) # this is a syntax error -- we trap ipy exceptions
------------------------------------------------------------
File "<ipython console>", line 1
q = z[) # this is a syntax error -- we trap ipy exceptions
^
SyntaxError: invalid syntax
The embedded interpreter supports some limited markup. For example, you can put comments in your ipython sessions, which are reported verbatim. There are some handy “pseudo-decorators” that let you doctest the output. The inputs are fed to an embedded ipython session and the outputs from the ipython session are inserted into your doc. If the output in your doc and in the ipython session don’t match on a doctest assertion, an error will be raised:
In [1]: x = 'hello world'
# this will raise an error if the ipython output is different
@doctest
In [2]: x.upper()
Out[2]: 'HELLO WORLD'
# some readline features cannot be supported, so we allow
# "verbatim" blocks, which are dumped in verbatim except prompts
# are continuously numbered
@verbatim
In [3]: x.st<TAB>
x.startswith x.strip
Multi-line input is supported.
In [130]: url = 'http://ichart.finance.yahoo.com/table.csv?s=CROX\
.....: &d=9&e=22&f=2009&g=d&a=1&br=8&c=2006&ignore=.csv'
In [131]: print url.split('&')
--------> print(url.split('&'))
['http://ichart.finance.yahoo.com/table.csv?s=CROX', 'd=9', 'e=22',
You can do doctesting on multi-line output as well. Just be careful when using non-deterministic inputs like random numbers in the ipython directive, because your inputs are run through a live interpreter, so if you are doctesting random output you will get an error. Here we “seed” the random number generator for deterministic output, and we suppress the seed line so it doesn’t show up in the rendered output:
In [133]: import numpy.random
@suppress
In [134]: numpy.random.seed(2358)
@doctest
In [135]: numpy.random.rand(10,2)
Out[135]:
array([[ 0.64524308, 0.59943846],
[ 0.47102322, 0.8715456 ],
[ 0.29370834, 0.74776844],
[ 0.99539577, 0.1313423 ],
[ 0.16250302, 0.21103583],
[ 0.81626524, 0.1312433 ],
[ 0.67338089, 0.72302393],
[ 0.7566368 , 0.07033696],
[ 0.22591016, 0.77731835],
[ 0.0072729 , 0.34273127]])
Another demonstration of multi-line input and output
In [106]: print x
--------> print(x)
jdh
In [109]: for i in range(10):
.....: print i
.....:
.....:
0
1
2
3
4
5
6
7
8
9
Most of the “pseudo-decorators” can be used as options to the ipython directive. For example, to set up matplotlib pylab but suppress the output, you can do the following. When using the matplotlib use directive, it should occur before any import of pylab. This will not show up in the rendered docs, but the commands will be executed in the embedded interpreter and subsequent line numbers will be incremented to reflect the inputs:
.. code:: python3
In [144]: from pylab import *
In [145]: ion()
In [144]: from pylab import *
In [145]: ion()
Likewise, you can set :doctest: or :verbatim: to apply these settings to the entire block. For example,
In [9]: cd mpl/examples/
/home/jdhunter/mpl/examples
In [10]: pwd
Out[10]: '/home/jdhunter/mpl/examples'
In [14]: cd mpl/examples/<TAB>
mpl/examples/animation/ mpl/examples/misc/
mpl/examples/api/ mpl/examples/mplot3d/
mpl/examples/axes_grid/ mpl/examples/pylab_examples/
mpl/examples/event_handling/ mpl/examples/widgets
In [14]: cd mpl/examples/widgets/
/home/msierig/mpl/examples/widgets
In [15]: !wc *
2 12 77 README.txt
40 97 884 buttons.py
26 90 712 check_buttons.py
19 52 416 cursor.py
180 404 4882 menu.py
16 45 337 multicursor.py
36 106 916 radio_buttons.py
48 226 2082 rectangle_selector.py
43 118 1063 slider_demo.py
40 124 1088 span_selector.py
450 1274 12457 total
You can create one or more pyplot plots and insert them with the @savefig decorator.
@savefig plot_simple.png width=4in
In [151]: plot([1,2,3]);
# use a semicolon to suppress the output
@savefig hist_simple.png width=4in
In [151]: hist(np.random.randn(10000), 100);
In a subsequent session, we can update the current figure with some
text, and then resave
In [151]: ylabel('number')
In [152]: title('normal distribution')
@savefig hist_with_text.png width=4in
In [153]: grid(True)
You can also have function definitions included in the source.
In [3]: def square(x):
...: """
...: An overcomplicated square function as an example.
...: """
...: if x < 0:
...: x = abs(x)
...: y = x * x
...: return y
...:
Then call it from a subsequent section.
In [4]: square(3)
Out[4]: 9
In [5]: square(-2)
Out[5]: 4
Writing Pure Python Code
Pure python code is supported by the optional argument python. In this pure
python syntax you do not include the output from the python interpreter. The
following markup:
.. code:: python
foo = 'bar'
print foo
foo = 2
foo**2
Renders as
foo = 'bar'
print foo
foo = 2
foo**2
We can even plot from python, using the savefig decorator, as well as suppress output with a semicolon:
@savefig plot_simple_python.png width=4in
plot([1,2,3]);
Similarly, std err is inserted
Comments are handled and state is preserved
# comments are handled
print foo
If you don’t see the next code block then the options work.
Multi-line input is handled.
line = 'Multi\
line &\
support &\
works'
print line.split('&')
Function definitions are correctly parsed
def square(x):
"""
An overcomplicated square function as an example.
"""
if x < 0:
x = abs(x)
y = x * x
return y
And persist across sessions
print square(3)
print square(-2)
Pretty much anything you can do with the ipython code, you can do with a simple python script. Obviously, though, it doesn’t make sense to use the doctest option.
Architecture of IPython notebook’s Dashboard
The tables below show the current RESTful web service architecture implemented in the IPython notebook. The listed URLs use the HTTP verbs to return representations of the desired resource.
We are in the process of creating a new dashboard architecture for the
IPython notebook, which will allow the user to navigate through multiple
directory files to find desired notebooks.
Current Architecture
Miscellaneous

GET /.*/
    Strips trailing slashes.

GET /api
    Returns api version information.

* /api/notebooks
    Deprecated: redirect to /api/contents.

GET /api/nbconvert
    —
Notebook contents API.
GET /api/contents
    Return a model for the base directory.
    See /api/contents/<path>/<file>.

GET /api/contents/<file>
    Return a model for the given file in the base directory.
    See /api/contents/<path>/<file>.

GET /api/contents/<path>/<file>
    Return a model for a file or directory. A directory model contains
    a list of models (without content) of the files and directories it
    contains.

PUT /api/contents/<path>/<file>
    Save the file in the location specified by name and path. PUT is
    very similar to POST, but the requester specifies the name, whereas
    with POST, the server picks the name.
    PUT /api/contents/path/Name.ipynb saves a notebook at
    path/Name.ipynb. The notebook structure is specified in the content
    key of the JSON request body. If content is not specified, a new
    empty notebook is created.
    PUT /api/contents/path/Name.ipynb with JSON body
    {"copy_from": "[path/to/]OtherNotebook.ipynb"} copies OtherNotebook
    to Name.

PATCH /api/contents/<path>/<file>
    Rename a notebook without re-uploading content.

POST /api/contents/<path>/<file>
    Create a new file or directory in the specified path. POST creates
    new files or directories; the server always decides on the name.
    POST /api/contents/path creates a new untitled notebook in path. If
    content is specified, upload a notebook; otherwise start empty.
    POST /api/contents/path with body
    {"copy_from": "OtherNotebook.ipynb"} creates a new copy of
    OtherNotebook in path.

DELETE /api/contents/<path>/<file>
    Delete a file in the given path.

GET /api/contents/<path>/<file>/checkpoints
    List checkpoints for a file.

POST /api/contents/<path>/<file>/checkpoints
    Create a new checkpoint.

POST /api/contents/<path>/<file>/checkpoints/<checkpoint_id>
    Restore a file from a checkpoint.

DELETE /api/contents/<path>/<file>/checkpoints/<checkpoint_id>
    Clear a checkpoint for a given file.
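A hypothetical client-side sketch of calling the contents API above; the base URL, helper names, and absence of authentication handling are all simplifying assumptions, not part of the documented server:

```python
import json
from urllib.request import urlopen

BASE = "http://127.0.0.1:8888"   # assumption: default local notebook server

def contents_url(path=""):
    """Build the URL for GET /api/contents/<path> as described above."""
    return "%s/api/contents/%s" % (BASE, path)

def get_contents(path=""):
    """Fetch and decode a contents model (requires a running server)."""
    with urlopen(contents_url(path)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# URL construction works without a server; get_contents() needs one running.
print(contents_url("work/Untitled.ipynb"))
```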
Kernel API
GET /api/kernels
    Return a model of all kernels.

GET /api/kernels/<kernel_id>
    Return a model of the kernel with the given kernel id.

POST /api/kernels
    Start a new kernel with the default or given name.

DELETE /api/kernels/<kernel_id>
    Shut down the given kernel.

POST /api/kernels/<kernel_id>/<action>
    Perform an action on the kernel with the given kernel id. Actions
    can be “interrupt” or “restart”.

WS /api/kernels/<kernel_id>/channels
    Websocket stream.

GET /api/kernelspecs
    Return a spec model of all available kernels.

GET /api/kernelspecs/<kernel_name>
    Return a spec model of the available kernel with the given kernel
    name.
Sessions API
GET /api/sessions
    Return a model of the active sessions.

POST /api/sessions
    If the session does not already exist, create a new session with
    the given notebook name and path and the given kernel name. Return
    the active session.

GET /api/sessions/<session_id>
    Return the model of the active session with the given session id.

PATCH /api/sessions/<session_id>
    Update the notebook name or path of the active session with the
    given session id; return the updated model.

DELETE /api/sessions/<session_id>
    Delete the active session with the given session id.
Clusters API
GET /api/clusters
    Return a model of the clusters.

GET /api/clusters/<cluster_id>
    Return a model of the given cluster.

POST /api/clusters/<cluster_id>/<action>
    Perform an action on the given cluster. Valid actions are “start”
    and “stop”.
Old Architecture
This chart shows the web-services in the single directory IPython
notebook.
GET /notebooks
    Return a list of dicts with each notebook’s info.

POST /notebooks
    If sending a body, save that body as a new notebook; if no body,
    create a new notebook.

GET /notebooks/<notebook_id>
    Get the JSON data for the notebook.

PUT /notebooks/<notebook_id>
    Save an existing notebook with the body data.

DELETE /notebooks/<notebook_id>
    Delete the notebook with the given ID.
This chart shows the architecture for the IPython notebook website.
GET /
    Navigate the user to the dashboard of notebooks and clusters.

GET /<notebook_id>
    Go to the webpage for that notebook.

GET /new
    Create a new notebook with profile (or default, if no profile
    exists) settings.

GET /<notebook_id>/copy
    Open a duplicate copy of the notebook with the given ID in a new
    tab.

GET /<notebook_id>/print
    Print the notebook with the given ID; if the notebook doesn’t
    exist, display an error message.

GET /login
    Navigate to the login page; if no user profile is defined, navigate
    the user to the dashboard.

GET /logout
    Log out of the current profile, and navigate the user to the login
    page.
This chart shows the Web services that act on the kernels and clusters.
GET /kernels
    Return the list of kernel IDs currently running.

GET /kernels/<kernel_id>
    —

GET /kernels/<kernel_id>/<action>
    Perform an action (restart/kill) on the kernel with the given ID.

GET /kernels/<kernel_id>/iopub
    —

GET /kernels/<kernel_id>/shell
    —

GET /rstservice/render
    —

GET /files/(.*)
    —

GET /clusters
    Return a list of dicts with each cluster’s information.

POST /clusters/<profile_id>/<cluster_action>
    Perform an action (start/stop) on the cluster with the given
    profile ID.

GET /clusters/<profile_id>
    Return the JSON data for the cluster with the given profile ID.
Setup IPython development environment using boot2docker
The following are instructions on how to get an IPython development environment up and running without having to install anything on your host machine, other than boot2docker (https://github.com/boot2docker/boot2docker) and docker (https://www.docker.com/).
Install boot2docker
There are multiple ways to install, depending on your environment. See the boot2docker docs.
Mac OS X
On a Mac OS X host with Homebrew installed:
$ brew install boot2docker docker
Initialize boot2docker VM
Start VM
The boot2docker CLI communicates with the docker daemon on the boot2docker VM. To do this, we must set some environment variables, e.g. DOCKER_HOST:
$ $(boot2docker shellinit)
To view the IP address of the VM:
$ boot2docker ip
192.168.59.103
Install ipython from Development Branch
$ git clone --recursive https://github.com/ipython/ipython.git
Build Docker Image
Use the Dockerfile in the cloned ipython directory to build a Docker image.
$ cd ipython
$ docker build --rm -t ipython .
Run Docker Container
Run a container using the new image. We mount the entire ipython source tree on the host into the container at /srv/ipython so that changes we make to the source on the host are immediately reflected in the container.
# change to the root of the git clone
$ cd ipython
$ docker run -it --rm -p 8888:8888 --workdir /srv/ipython --name ipython-dev -v `pwd`:/srv/ipython ipython /bin/bash
To list the running container from another shell on the host:
$ $(boot2docker shellinit)
$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
f6065f206519 ipython "/bin/bash" 1 minutes ago Up 1 minutes 0.0.0.0:8888->8888/tcp ipython-dev
Install IPython in Editable Mode
Once in the container, you’ll need to uninstall the ipython package and re-install it in editable mode so that your dev changes are reflected in your environment.
container $ pip uninstall ipython
# pip install ipython in editable mode
container $ cd /srv
container $ ls
ipython
container $ pip install -e ipython
Run Notebook Server
container $ ipython notebook --no-browser --ip=*
Visit Notebook Server
On your host, run the following command to get the IP of the boot2docker
VM if you forgot:
# on host
$ boot2docker ip
192.168.59.103
Then visit it in your browser:
# browser
http://192.168.59.103:8888
As a shortcut on a Mac, you can run the following in a terminal window
(or make it a bash alias):
$ open http://$(boot2docker ip 2>/dev/null):8888
Testing Kernels
IPython makes it very easy to create wrapper kernels using its kernel framework. This requires extending the Kernel class and implementing a set of methods for the core functions like execute, history, etc. It’s also possible to write a full-blown kernel in a language of your choice, implementing listeners for all the zmq ports.
The key problem for any kernel implemented by these methods is to ensure
that it meets the message specification. The kerneltest command is a
means to test the installed kernel against the message spec and validate
the results.
The test script file
The test script file is a simple json file that specifies the command to
execute and the test code to execute for the command.
{
    "command": {
        "test_code": <code>
    }
}
For some commands in the message specification, like kernel_info, there is no need to specify the test_code parameter. The tool validates whether it has all the inputs needed to execute the command and will print an error to the console if it finds a missing parameter. Since the validation is built in, and only required parameters are passed, it is possible to add additional fields in the json file for test documentation.
{
    "command": {
        "test_name": "sample test",
        "test_description": "sample test to show how the test script file is created",
        "test_code": <code>
    }
}
A sample test script for the redis kernel looks like this:
{
    "execute": {
        "test_code": "get a",
        "comments": "test basic code execution"
    },
    "complete": {
        "test_code": "get",
        "comments": "test getting command auto complete"
    },
    "kernel_info": {
        "comments": "simple kernel info check"
    },
    "single_payload": {
        "test_code": "get a",
        "comments": "test one payload"
    },
    "history_tail": {
        "test_code": "get a",
        "comments": "test tail history"
    },
    "history_range": {
        "test_code": "get a",
        "comments": "test range history"
    },
    "history_search": {
        "test_code": "get a",
        "comments": "test search history"
    }
}
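The validation behavior described above could be sketched like this; the NO_CODE_NEEDED set and the function are our own illustration, not the actual kerneltest implementation:

```python
import json

# Assumption based on the text: every command needs test_code except
# those, like kernel_info, that take no code.
NO_CODE_NEEDED = {"kernel_info"}

def validate_script(text):
    """Return a list of error messages for a test-script JSON string."""
    errors = []
    for command, params in json.loads(text).items():
        if command not in NO_CODE_NEEDED and "test_code" not in params:
            errors.append("missing test_code for command: %s" % command)
    return errors

good = '{"execute": {"test_code": "get a"}, "kernel_info": {}}'
bad = '{"complete": {"comments": "no code given"}}'
print(validate_script(good))   # → []
print(validate_script(bad))    # → ['missing test_code for command: complete']
```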