A very simplistic remote-command-executor using connections to hosts (``ssh``,
local, containers, and several others are supported) and Python in the remote
end.
All the heavy lifting is done by ``execnet``, while this minimal API provides
the bare minimum to handle easy logging and connections from the remote end.
``remoto`` is a bit opinionated, as it was conceived to replace helpers and
remote utilities for ``ceph-deploy``, a tool to run remote commands to
configure and set up the distributed file system Ceph. `ceph-medic
<https://pypi.org/project/ceph-medic/>`_ uses remoto as well to inspect Ceph
clusters.
The usage aims to be extremely straightforward, with a very minimal set of
helpers and utilities for remote processes and logging output.
The most basic example will use the ``run`` helper to execute a command on the
remote end. It requires a logging object, which, at the very least, must have
both ``error`` and ``debug`` methods. Those are called for ``stderr`` and
``stdout`` output respectively.
This is how it would look with a basic logger passed in::
    >>> conn = remoto.Connection('hostname')
    >>> run(conn, ['ls', '-a'])
    INFO:hostname:Running command: ls -a
    DEBUG:hostname:.bash_history
    DEBUG:hostname:.bash_logout
    DEBUG:hostname:.bash_profile
    DEBUG:hostname:.bashrc
    DEBUG:hostname:.lesshst
    DEBUG:hostname:.viminfo
The ``run`` helper will display the ``stderr`` and ``stdout`` as ``ERROR`` and
``DEBUG`` respectively.
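Since the logger is duck-typed, anything with callable ``error`` and ``debug``
attributes will do. A minimal sketch of such an object (the ``ListLogger``
class is purely illustrative, not part of remoto; the stdlib ``logging``
module is what most callers would actually pass in):

```python
import logging

class ListLogger:
    """Collects messages instead of printing them (illustrative only).

    remoto only requires ``debug`` and ``error`` to be callable, so a
    tiny stand-in like this is enough for testing.
    """
    def __init__(self):
        self.lines = []

    def debug(self, message):
        # Receives each line of the remote command's stdout
        self.lines.append(('DEBUG', message))

    def error(self, message):
        # Receives each line of the remote command's stderr
        self.lines.append(('ERROR', message))

# The stdlib equivalent, which also satisfies the interface:
logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger('hostname')
```

With remoto available, either object could then be handed to the connection
(e.g. as the ``logger`` argument of ``Connection``; check your installed
version for the exact signature).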
For other types of usage, like checking exit status codes or raising an
exception on failure, ``remoto`` provides helpers too.
Calling remote commands can be done in a few different ways. The simplest one
is with ``process.run``::
    >>> from remoto.process import run
    >>> from remoto import connection
    >>> Connection = connection.get('ssh')
    >>> conn = Connection('myhost')
    >>> run(conn, ['whoami'])
    INFO:myhost:Running command: whoami
Note, however, that you are not capturing results or information from the
remote end. The intention here is only to run a command and log its output. It
is a *fire and forget* call.
This callable allows the caller to deal with the ``stderr``, ``stdout``, and
exit code. It returns them as a 3-item tuple::
    >>> from remoto.process import check
    >>> check(conn, ['ls', '/nonexistent/path'])
    ([], ['ls: cannot access /nonexistent/path: No such file or directory'], 2)
Note that the ``stdout`` and ``stderr`` items are returned as lists with the
``\n`` characters stripped.

This is useful if you need to process the information back locally, as opposed
to just firing and forgetting (while logging, like ``process.run``).
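Because ``check`` hands back the plain ``(stdout, stderr, exit_code)`` tuple,
a caller can layer its own error handling on top. A sketch, where
``raise_on_error`` is a hypothetical helper built by the caller, not a remoto
API:

```python
def raise_on_error(result, command):
    """Turn a ``(stdout, stderr, exit_code)`` tuple into an exception.

    ``result`` is shaped like the return value of ``remoto.process.check``;
    this helper itself is illustrative, not part of remoto.
    """
    stdout, stderr, code = result
    if code != 0:
        raise RuntimeError(
            '%s failed with exit code %d: %s' % (command, code, '\n'.join(stderr))
        )
    return stdout

# Example with a canned result tuple, shaped like the output above:
ok = raise_on_error((['file.txt'], [], 0), 'ls')
# ok == ['file.txt']
```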
There are two supported ways to execute functions on the remote side. The
library that ``remoto`` uses to connect (``execnet``) only supports a few
backends *natively*, and ``remoto`` has extended this ability to other backend
connections, like ``kubernetes``.
The remote function capabilities are provided by the ``LegacyModuleExecute``
and ``JsonModuleExecute`` classes. By default, both ``ssh`` and ``local``
connections will use the legacy execution class, and everything else will use
the ``json`` one. The ``ssh`` and ``local`` connections can still be forced to
use the newer module execution by setting::
    conn.remote_import_system = 'json'
This is the default execution model for ``docker``, ``kubernetes``,
``podman``, and ``openshift`` connections. It does not require any magic in
the module to be executed; however, it is worth noting that the library *will*
add the following bit of magic when sending the module to the remote end for
execution::
    if __name__ == '__main__':
        import json, traceback
        obj = {'return': None, 'exception': None}
        try:
            obj['return'] = function_name(*a)
        except Exception:
            obj['exception'] = traceback.format_exc()
        try:
            print(json.dumps(obj).decode('utf-8'))
        except AttributeError:
            print(json.dumps(obj))
This allows the system to execute ``function_name`` (replaced by the real
function to be executed, with its arguments), grab any results, serialize them
with ``json``, and send them back for local processing.
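That round trip can be simulated locally without remoto at all. This sketch
mimics the capture-and-serialize step (``run_serialized`` is an illustrative
name, not part of remoto):

```python
import json
import traceback

def run_serialized(function, *args):
    """Mimic the JSON wrapper: capture either the return value or a
    formatted traceback, then serialize the result as JSON."""
    obj = {'return': None, 'exception': None}
    try:
        obj['return'] = function(*args)
    except Exception:
        obj['exception'] = traceback.format_exc()
    return json.dumps(obj)

# A successful call serializes its return value:
payload = json.loads(run_serialized(sorted, [3, 1, 2]))
# payload['return'] == [1, 2, 3]; payload['exception'] is None
```

A failing call would instead leave ``return`` as ``None`` and put the
formatted traceback under ``exception``, which is what lets the local side
re-raise or log remote failures.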
If you had a function in a module named ``foo`` that looks like this::

    import os

    def listdir(path):
        return os.listdir(path)
To be able to execute that ``listdir`` function remotely you would need to pass
the module to the connection object and then call that function::
    >>> conn = Connection('hostname')
    >>> remote_foo = conn.import_module(foo)
    >>> remote_foo.listdir('.')
Note that functions to be executed remotely **cannot** accept objects as
arguments, just normal Python data structures, like tuples, lists, and
dictionaries. Ints and strings are also safe to use.
When using the ``legacy`` execution model (the default for ``local`` and
``ssh`` connections), modules are required to add the following to the end of
the module::

    if __name__ == '__channelexec__':
        for item in channel:
            channel.send(eval(item))
This piece of code is fully compatible with the ``json`` execution model, and
would not cause conflicts.
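What that boilerplate does can be simulated locally with a stand-in for the
channel object that ``execnet`` injects. Both ``FakeChannel`` and
``channelexec_loop`` below are purely illustrative, under the assumption that
the channel delivers request strings to evaluate and carries results back:

```python
class FakeChannel:
    """Stand-in for an execnet channel: iterates over queued request
    strings and records whatever is sent back (illustrative only)."""
    def __init__(self, requests):
        self.requests = requests
        self.sent = []

    def __iter__(self):
        return iter(self.requests)

    def send(self, value):
        self.sent.append(value)

def channelexec_loop(channel, namespace):
    # Same shape as the legacy boilerplate above: evaluate each incoming
    # request string and send the result back over the channel.
    for item in channel:
        channel.send(eval(item, namespace))

channel = FakeChannel(["listdir('.')"])
channelexec_loop(channel, {'listdir': lambda path: ['a', 'b']})
# channel.sent == [['a', 'b']]
```

The use of ``eval`` here mirrors the legacy model's protocol, which is part of
why the newer ``json`` model wraps calls explicitly instead.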
Automatic detection for ssh connections
---------------------------------------
There is automatic detection for whether a remote connection (via SSH) is
needed, inferred by comparing the hostname of the current host with the host
being connected to.

If the local host has the same hostname as the remote host, a local connection
(via ``Popen``) will be opened and used instead of ``ssh``, avoiding the
problems of trying to ssh into the same host.
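The comparison can be sketched with the stdlib. ``needs_ssh`` below is a
hypothetical helper, not remoto's actual implementation, which may also
normalize fully-qualified names differently:

```python
import socket

def needs_ssh(hostname):
    """Return True when ``hostname`` is not the machine we are already on,
    meaning an SSH connection would be required (sketch only)."""
    local = socket.gethostname()
    # Accept both the full local name and its short (pre-dot) form.
    local_names = {local, local.split('.')[0]}
    return hostname not in local_names

# Connecting to ourselves needs no SSH; a Popen-based local
# connection would be used instead.
local_connection = not needs_ssh(socket.gethostname())
```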
Automatic detection for using `sudo`
------------------------------------
This magical detection can be enabled by using the `detect_sudo` flag in the
`Connection` class. It is disabled by default.
When enabled, it will prefix any command with `sudo`. This is useful for
libraries that need super-user permissions and want to avoid passing `sudo`
everywhere, which can be non-trivial if dealing with `root` users that are