io#

Common I/O utilities.

Classes#

DeltaSecondsFormatter

Logging formatter with additional attributes for run time logging.

ContextDecorator

Base class for a context manager class (implementing __enter__() and __exit__()) that also makes it a decorator.

SwallowBrokenPipe

Base class for a context manager class (implementing __enter__() and __exit__()) that also makes it a decorator.

CaptureTarget

Constants used for the captured() context manager.

Spinner

An animated terminal spinner, prefixed with message; the string ': ' is automatically appended.

ProgressBar

DummyExecutor

An Executor that runs submitted callables synchronously in the calling thread.

ThreadLimitedThreadPoolExecutor

A ThreadPoolExecutor that tolerates failures to spawn additional worker threads.

time_recorder

Base class for a context manager class (implementing __enter__() and __exit__()) that also makes it a decorator.

Functions#

dashlist(iterable[, indent])

env_vars([var_map, callback, stack_callback])

env_var(name, value[, callback, stack_callback])

env_unmodified([callback])

captured([stdout, stderr])

Capture outputs of sys.stdout and sys.stderr.

argv(args_list)

_logger_lock()

disable_logger(logger_name)

stderr_log_level(level[, logger_name])

attach_stderr_handler([level, logger_name, propagate, ...])

Attach a new stderr handler to the given logger and configure both.

timeout(timeout_secs, func, *args[, default_return])

Enforce a maximum time for a callable to complete.

get_instrumentation_record_file()

print_instrumentation_data()

Attributes#

IS_INTERACTIVE#
class DeltaSecondsFormatter(fmt=None, datefmt=None)#

Bases: logging.Formatter

Logging formatter with additional attributes for run time logging.

`delta_secs`

Elapsed seconds since last log/format call (or creation of logger).

`relative_created_secs`

Like relativeCreated: time relative to the initialization of the logging module, but conveniently scaled to seconds as a float value.

format(record)#

Format the specified record as text.

The record's attribute dictionary is used as the operand to a string formatting operation which yields the returned string. Before formatting the dictionary, a couple of preparatory steps are carried out. The message attribute of the record is computed using LogRecord.getMessage(). If the formatting string uses the time (as determined by a call to usesTime()), formatTime() is called to format the event time. If there is exception information, it is formatted using formatException() and appended to the message.
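The two extra attributes can be sketched with a plain logging.Formatter subclass. The class name and implementation details below are illustrative assumptions, not conda's actual code:

```python
import logging
import time


class DeltaFormatter(logging.Formatter):
    """Illustrative sketch of a formatter exposing the two attributes above."""

    def __init__(self, fmt=None, datefmt=None):
        self._prev = time.monotonic()  # time of the previous format() call
        super().__init__(fmt=fmt, datefmt=datefmt)

    def format(self, record):
        now = time.monotonic()
        # seconds elapsed since the last format() call (or formatter creation)
        record.delta_secs = now - self._prev
        self._prev = now
        # relativeCreated is milliseconds; rescale to seconds as a float
        record.relative_created_secs = record.relativeCreated / 1000
        return super().format(record)
```

Format strings such as `"%(delta_secs)10.2f %(message)s"` can then reference the new attributes.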

_FORMATTER#
dashlist(iterable, indent=2)#
class ContextDecorator#

Base class for a context manager class (implementing __enter__() and __exit__()) that also makes it a decorator.

__call__(f)#
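The decorator behavior of __call__() can be sketched as follows; this is a minimal stand-in analogous to contextlib.ContextDecorator, with implementation details assumed:

```python
from functools import wraps


class ContextDecorator:
    """Context manager base class that also works as a function decorator."""

    def __call__(self, f):
        @wraps(f)
        def decorated(*args, **kwargs):
            # __enter__/__exit__ are supplied by the subclass
            with self:
                return f(*args, **kwargs)

        return decorated
```

A subclass only needs to define __enter__() and __exit__() to be usable both as a `with` target and as a decorator.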
class SwallowBrokenPipe#

Bases: ContextDecorator

Base class for a context manager class (implementing __enter__() and __exit__()) that also makes it a decorator.

__enter__()#
__exit__(exc_type, exc_val, exc_tb)#
swallow_broken_pipe#
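The swallowing behavior might look like this sketch; the real class also derives from ContextDecorator and may handle related OSError cases, which are omitted here:

```python
class SwallowBrokenPipe:
    """Suppress BrokenPipeError raised inside the managed block (sketch)."""

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Returning True from __exit__ suppresses the active exception.
        return exc_type is not None and issubclass(exc_type, BrokenPipeError)
```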
class CaptureTarget#

Bases: enum.Enum

Constants used for the captured() context manager.

Used similarly to the constants PIPE and STDOUT of stdlib's subprocess.Popen.

STRING#
STDOUT#
env_vars(var_map=None, callback=None, stack_callback=None)#
env_var(name, value, callback=None, stack_callback=None)#
env_unmodified(callback=None)#
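A sketch of what env_vars() plausibly does (the function name and details here are assumptions): temporarily override environment variables and restore the previous state on exit.

```python
import os
from contextlib import contextmanager


@contextmanager
def env_vars_sketch(var_map=None):
    """Temporarily set environment variables, restoring old values on exit."""
    var_map = var_map or {}
    saved = {name: os.environ.get(name) for name in var_map}
    os.environ.update({name: str(value) for name, value in var_map.items()})
    try:
        yield
    finally:
        for name, value in saved.items():
            if value is None:
                os.environ.pop(name, None)  # variable did not exist before
            else:
                os.environ[name] = value
```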
captured(stdout=CaptureTarget.STRING, stderr=CaptureTarget.STRING)#

Capture outputs of sys.stdout and sys.stderr.

If stdout is STRING, capture sys.stdout as a string; if stdout is None, do not capture sys.stdout, leaving it untouched; otherwise redirect sys.stdout to the file-like object given by stdout.

Behave correspondingly for stderr, with the exception that if stderr is STDOUT, sys.stderr is redirected to the stdout target and the stderr attribute of the yielded object is set to None.

>>> from conda.common.io import captured
>>> with captured() as c:
...     print("hello world!")
...
>>> c.stdout
'hello world!\n'
Parameters:
  • stdout -- capture target for sys.stdout, one of STRING, None, or file-like object

  • stderr -- capture target for sys.stderr, one of STRING, STDOUT, None, or file-like object

Yields:

CapturedText -- has attributes stdout and stderr, which are either strings, None, or the corresponding file-like function argument.
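The STRING case can be sketched with StringIO substitution; this is a simplified stand-in for the real implementation, handling only the default target:

```python
import sys
from contextlib import contextmanager
from io import StringIO
from types import SimpleNamespace


@contextmanager
def captured_sketch():
    """Capture sys.stdout/sys.stderr into strings (STRING target only)."""
    c = SimpleNamespace(stdout=None, stderr=None)
    saved_out, saved_err = sys.stdout, sys.stderr
    sys.stdout, sys.stderr = StringIO(), StringIO()
    try:
        yield c
    finally:
        # expose the captured text, then restore the real streams
        c.stdout = sys.stdout.getvalue()
        c.stderr = sys.stderr.getvalue()
        sys.stdout, sys.stderr = saved_out, saved_err
```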

argv(args_list)#
_logger_lock()#
disable_logger(logger_name)#
stderr_log_level(level, logger_name=None)#
attach_stderr_handler(level=WARN, logger_name=None, propagate=False, formatter=None, filters=None)#

Attach a new stderr handler to the given logger and configure both.

This function creates a new StreamHandler that writes to stderr and attaches it to the logger given by logger_name (which may be None, in which case the root logger is used). If the logger already has a handler named stderr, it is removed first.

The given level is set for the handler, not for the logger; however, this function also sets the level of the given logger to the minimum of its current effective level and the new handler level, ensuring that the handler will receive the required log records while minimizing the number of unnecessary log events. It also sets the logger's propagate property according to the propagate argument. The formatter argument can be used to set the formatter of the handler.
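The described behavior can be sketched with stdlib logging; the function name below is illustrative, and the real function also accepts formatter and filters:

```python
import logging
import sys


def attach_stderr_sketch(level=logging.WARN, logger_name=None, propagate=False):
    """Sketch: attach a named stderr handler and reconcile logger levels."""
    logger = logging.getLogger(logger_name)
    # remove a previously attached handler named "stderr", if any
    for old in [h for h in logger.handlers if h.name == "stderr"]:
        logger.removeHandler(old)
    handler = logging.StreamHandler(sys.stderr)
    handler.name = "stderr"
    handler.setLevel(level)
    logger.addHandler(handler)
    # lower the logger level only if it would filter out the handler's records
    if logger.getEffectiveLevel() > level:
        logger.setLevel(level)
    logger.propagate = propagate
```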

timeout(timeout_secs, func, *args, default_return=None, **kwargs)#

Enforce a maximum time for a callable to complete. Not yet implemented on Windows.
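The Windows caveat suggests a SIGALRM-based approach, which only exists on Unix and only works in the main thread; a sketch under those assumptions:

```python
import signal


def timeout_sketch(timeout_secs, func, *args, default_return=None, **kwargs):
    """Run func, returning default_return if it exceeds timeout_secs (Unix only)."""

    def _raise_timeout(signum, frame):
        raise TimeoutError()

    previous = signal.signal(signal.SIGALRM, _raise_timeout)
    signal.alarm(int(timeout_secs))  # SIGALRM delivered after this many seconds
    try:
        return func(*args, **kwargs)
    except TimeoutError:
        return default_return
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, previous)
```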

class Spinner(message, enabled=True, json=False, fail_message='failed\n')#
Parameters:
  • message (str) -- A message to prefix the spinner with. The string ': ' is automatically appended.

  • enabled (bool) -- If False, usage is a no-op.

  • json (bool) -- If True, will not output non-json to stdout.

spinner_cycle#
start()#
stop()#
_start_spinning()#
__enter__()#
__exit__(exc_type, exc_val, exc_tb)#
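A minimal sketch of how the spinner plausibly works: a daemon thread cycles through a few characters on stdout until a stop is requested. The real class also supports json mode and a custom fail_message, which are omitted here:

```python
import itertools
import sys
import threading
import time


class SpinnerSketch:
    """Background-thread spinner (illustrative; details assumed)."""

    def __init__(self, message, enabled=True):
        self.message = message
        self.enabled = enabled
        self.spinner_cycle = itertools.cycle("/-\\|")
        self._stop_event = threading.Event()
        self._thread = None

    def _start_spinning(self):
        while not self._stop_event.is_set():
            sys.stdout.write(next(self.spinner_cycle))
            sys.stdout.flush()
            time.sleep(0.1)
            sys.stdout.write("\b")  # erase the spinner character

    def __enter__(self):
        sys.stdout.write(f"{self.message}: ")
        if self.enabled:
            self._thread = threading.Thread(target=self._start_spinning, daemon=True)
            self._thread.start()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if self._thread is not None:
            self._stop_event.set()
            self._thread.join()
        sys.stdout.write("failed\n" if exc_type else "done\n")
```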
class ProgressBar(description, enabled=True, json=False, position=None, leave=True)#
classmethod get_lock()#
update_to(fraction)#
finish()#
refresh()#

Force a refresh, e.g. once 100% has been reached.

close()#
static _tqdm(*args, **kwargs)#

Deferred import so it doesn't hit the conda activate paths.

class DummyExecutor#

Bases: concurrent.futures.Executor

An Executor that runs submitted callables synchronously in the calling thread.

submit(fn, *args, **kwargs)#

Submits a callable to be executed with the given arguments.

Schedules the callable to be executed as fn(*args, **kwargs) and returns a Future instance representing the execution of the callable.

Returns:

A Future representing the given call.
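Given that docstring, a synchronous executor can be sketched by resolving the Future immediately in the calling thread; the implementation details below are assumptions:

```python
from concurrent.futures import Executor, Future


class DummyExecutorSketch(Executor):
    """Executor that runs each callable immediately in the calling thread."""

    def submit(self, fn, *args, **kwargs):
        future = Future()
        try:
            future.set_result(fn(*args, **kwargs))
        except BaseException as exc:
            # a failed call is reported through the Future, like a real executor
            future.set_exception(exc)
        return future
```

map() then works without further code, since the base Executor.map builds on submit().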

map(func, *iterables)#

Returns an iterator equivalent to map(fn, iter).

Parameters:
  • fn -- A callable that will take as many arguments as there are passed iterables.

  • timeout -- The maximum number of seconds to wait. If None, then there is no limit on the wait time.

  • chunksize -- The size of the chunks the iterable will be broken into before being passed to a child process. This argument is only used by ProcessPoolExecutor; it is ignored by ThreadPoolExecutor.

Returns:

An iterator equivalent to map(func, *iterables), but the calls may be evaluated out-of-order.

Raises:
  • TimeoutError -- If the entire result iterator could not be generated before the given timeout.

  • Exception -- If fn(*args) raises for any values.

shutdown(wait=True)#

Clean-up the resources associated with the Executor.

It is safe to call this method several times; however, no other methods may be called after this one.

Parameters:
  • wait -- If True then shutdown will not return until all running futures have finished executing and the resources used by the executor have been reclaimed.

  • cancel_futures -- If True then shutdown will cancel all pending futures. Futures that are completed or running will not be cancelled.

class ThreadLimitedThreadPoolExecutor(max_workers=10)#

Bases: concurrent.futures.ThreadPoolExecutor

A ThreadPoolExecutor that tolerates failures to spawn additional worker threads.

submit(fn, *args, **kwargs)#

This is an exact reimplementation of the submit() method on the parent class, except with an added try/except around self._adjust_thread_count(). So long as there is at least one living thread, this thread pool will not throw an exception if threads cannot be expanded to max_workers.

In the implementation, we use "protected" attributes from concurrent.futures (_base and _WorkItem). Consider vendoring the whole concurrent.futures library as an alternative to these protected imports.

See also: agronholm/pythonfutures and python/cpython.
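A simpler sketch of the same tolerance (not the exact reimplementation described above) swallows the thread-spawn failure as long as one worker already exists; it still relies on the protected _adjust_thread_count/_threads attributes:

```python
from concurrent.futures import ThreadPoolExecutor


class ThreadLimitedSketch(ThreadPoolExecutor):
    """Keep going if the pool cannot grow, provided one worker thread exists."""

    def _adjust_thread_count(self):
        try:
            super()._adjust_thread_count()
        except RuntimeError:
            # e.g. "can't start new thread"; acceptable if a worker is alive
            if not self._threads:
                raise
```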

as_completed#
get_instrumentation_record_file()#
class time_recorder(entry_name=None, module_name=None)#

Bases: ContextDecorator

Base class for a context manager class (implementing __enter__() and __exit__()) that also makes it a decorator.

record_file#
start_time#
total_call_num#
total_run_time#
_set_entry_name(f)#
__call__(f)#
__enter__()#
__exit__(exc_type, exc_val, exc_tb)#
classmethod log_totals()#
_ensure_dir()#
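The attribute list suggests a context decorator that accumulates per-entry call counts and run times; a sketch under that assumption, in-memory only with no record file:

```python
import time
from collections import defaultdict
from functools import wraps


class time_recorder_sketch:
    """Accumulate call counts and total run time per entry name (sketch)."""

    total_call_num = defaultdict(int)
    total_run_time = defaultdict(float)

    def __init__(self, entry_name):
        self.entry_name = entry_name

    def __enter__(self):
        self.start_time = time.perf_counter()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        elapsed = time.perf_counter() - self.start_time
        type(self).total_call_num[self.entry_name] += 1
        type(self).total_run_time[self.entry_name] += elapsed

    def __call__(self, f):
        @wraps(f)
        def decorated(*args, **kwargs):
            with self:
                return f(*args, **kwargs)

        return decorated
```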
print_instrumentation_data()#