dcase_framework.learners.SceneClassifierMLP

class dcase_framework.learners.SceneClassifierMLP(*args, **kwargs)[source]

Scene classifier with MLP

This learner is a simple MLP-based learner built on the Keras neural network library and its sequential API. See the Keras documentation.
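
As an illustration only, the following sketch shows the kind of sequential MLP such a learner assembles; it is not the framework's create_model() implementation, and the layer sizes, activations, and optimizer are placeholder choices rather than the baseline configuration:

# Illustrative sketch of a sequential Keras MLP for scene classification; layer sizes,
# activations and the optimizer are placeholders, not the baseline configuration.
from keras.models import Sequential
from keras.layers import Activation, Dense, Dropout

feature_vector_length = 200   # placeholder for FEATURE_VECTOR_LENGTH
class_count = 15              # placeholder for CLASS_COUNT

model = Sequential()
model.add(Dense(50, input_shape=(feature_vector_length,)))   # first hidden layer
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(50))                                         # second hidden layer
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(class_count))                                # output layer, one unit per scene class
model.add(Activation('softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['categorical_accuracy'])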

Learner parameters

Field name Value type Description
seed int Randomization seed. Use this to make learner behaviour deterministic.
keras
backend string {theano | tensorflow} Keras backend selector.
keras->backend_parameters
device string {cpu | gpu} Device selector. cpu is the best option for producing deterministic results. All baseline results are calculated in cpu mode.
floatX string Floating point number type. Usually float32 is used since it is compatible with GPUs. Valid only for the theano backend.
fastmath bool If true, fastmath mode is enabled when CUDA code is compiled. Div and sqrt are faster, but precision is lower; this can cause numerical issues in some cases. Valid only for the theano backend in GPU mode.
optimizer string {fast_run | merge | fast_compile | None} Compilation mode for Theano functions.
openmp bool If true, Theano will use multiple cores; see the Theano documentation for details.
threads int Number of threads used. Use 1 to disable threading.
CNR bool Conditional numerical reproducibility for MKL BLAS. When set to true, compatible mode is used; see the MKL documentation for details.
validation
enable bool If true, validation set is used during the training procedure.
setup_source string Validation setup source. Valid sources:
  • generated_scene_balanced, balanced based on scene labels, used for Task 1.
  • generated_event_file_balanced, balanced based on events, used for Task 2.
  • generated_scene_location_event_balanced, balanced based on scene, location, and events, used for Task 3.
validation_amount float Amount of training data selected for validation, as a fraction. Use a value between 0.0 and 1.0.
seed int Validation set generation seed. If None, learner seed will be used.
training
epochs int Number of epochs.
batch_size int Batch size.
shuffle bool If true, training samples are shuffled at each epoch.
training->callbacks, list of parameter sets in the following format. Callbacks are called during model training.
type string Callback name. Use standard Keras callbacks or ones defined by dcase_framework (Plotter, Stopper, Stasher).
parameters dict Place all parameters for the callback inside this dict.
training->model->config, list of dicts defining the network topology (see the configuration sketch after this table).
class_name string Layer name. Use standard Keras core layers, convolutional layers, pooling layers, recurrent layers, or normalization layers.
config dict Place all parameters for the layer inside this dict; see the Keras documentation. Magic parameter values:
  • FEATURE_VECTOR_LENGTH, feature vector length. This is automatically inserted for the input layer.
  • CLASS_COUNT, number of classes.
input_shape list of ints List of integers which is converted into a tuple before being passed to the Keras layer.
training->model
loss string Keras loss function name. See Keras documentation.
metrics list of strings Keras metric function names. See Keras documentation.
training->model->optimizer
type string Keras optimizer name. See Keras documentation.
parameters dict Place all parameters for the optimizer inside this dict.
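
To make the field layout concrete, the following is a hypothetical parameter dictionary following the table above. The numeric values are placeholders rather than the baseline settings, layer arguments use Keras 2 names, and in the full system such values are normally read from the parameter file:

# Hypothetical parameter dictionary, following the field layout in the table above.
# Numeric values are placeholders, not the baseline settings; layer arguments use
# Keras 2 names ('units', 'rate'); the magic value CLASS_COUNT is replaced by the
# framework with the number of scene classes, and the input shape of the first
# layer (FEATURE_VECTOR_LENGTH) is inserted automatically.
learner_params = {
    'seed': 1,
    'keras': {
        'backend': 'theano',
        'backend_parameters': {
            'device': 'cpu',            # cpu mode for deterministic results
            'floatX': 'float32',
            'fastmath': False,
            'optimizer': None,
            'openmp': False,
            'threads': 1,
            'CNR': True,
        },
    },
    'validation': {
        'enable': True,
        'setup_source': 'generated_scene_balanced',
        'validation_amount': 0.10,
        'seed': None,                   # fall back to the learner seed
    },
    'training': {
        'epochs': 100,
        'batch_size': 256,
        'shuffle': True,
        'callbacks': [
            {
                'type': 'EarlyStopping',    # standard Keras callback
                'parameters': {'monitor': 'val_loss', 'patience': 10},
            },
        ],
        'model': {
            'config': [
                {'class_name': 'Dense',
                 'config': {'units': 50, 'activation': 'relu'}},
                {'class_name': 'Dropout',
                 'config': {'rate': 0.2}},
                {'class_name': 'Dense',
                 'config': {'units': 'CLASS_COUNT', 'activation': 'softmax'}},
            ],
            'loss': 'categorical_crossentropy',
            'metrics': ['categorical_accuracy'],
            'optimizer': {
                'type': 'Adam',
                'parameters': {'lr': 0.001},
            },
        },
    },
}
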
__init__(*args, **kwargs)[source]

Methods

__init__(*args, **kwargs)
clear() Remove all items from D
copy() A shallow copy of D
create_callback_list() Create list of Keras callbacks
create_external_metric_evaluators() Create external metric evaluators
create_model(input_shape) Create sequential Keras model
detect_file_format(filename) Detect file format from extension
empty() Check if file is empty
exists() Checks that file exists
fromkeys(S[, v]) New dict with keys from S and values equal to v (v defaults to None)
get(k[, d]) Return D[k] if k is in D, else d
get_dump_content(data) Clean internal content for saving
get_file_information() Get file information, filename
get_hash([data]) Get unique hash string (md5) for given parameter dict
get_hash_for_path([dotted_path])
get_path(dotted_path[, default, data]) Get value from nested dict with dotted path
get_processing_interval() Processing interval
has_key(k) True if D has a key k, else False
items() List of D's (key, value) pairs, as 2-tuples
iteritems() An iterator over the (key, value) items of D
iterkeys() An iterator over the keys of D
itervalues() An iterator over the values of D
keras_model_exists() Check that keras model exists on disk
keys() List of D's keys
learn(data, annotations[, data_filenames, ...]) Learn based on data and annotations
load(*args, **kwargs) Load file
log([level]) Log container content
log_model_summary() Prints model summary to the logging interface.
merge(override[, target]) Recursive dict merge
plot_model([filename, show_shapes, ...]) Plots model topology
pop(k[, d]) Remove specified key and return the corresponding value; if key is not found, d is returned if given, otherwise KeyError is raised
popitem() Remove and return some (key, value) pair as a 2-tuple; raise KeyError if D is empty
predict(feature_data) Predict frame probabilities for given feature matrix
prepare_activity(activity_matrix_dict, files) Concatenate activity matrices into one activity matrix
prepare_data(data, files[, processor]) Concatenate feature data into one feature matrix
save(*args, **kwargs) Save file
set_path(dotted_path, new_value[, data]) Set value in nested dict with dotted path
set_seed([seed]) Set randomization seeds
setdefault(k[, d]) Return D.get(k, d), also setting D[k] = d if k is not in D
show() Print container content
update([E, ]**F) Update D from dict/iterable E and F; if E is present and has a .keys() method, does: for k in E: D[k] = E[k]
values() List of D's values
viewitems() A set-like view of D's items
viewkeys() A set-like view of D's keys
viewvalues() A view of D's values
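
A minimal usage sketch based on the method summaries above; the constructor keyword arguments and the load_prepared_data() helper are assumptions for illustration, and the feature extraction and normalization chain that the framework normally provides is omitted:

# Minimal usage sketch. load_prepared_data() is a hypothetical helper standing in for
# the framework's feature extraction/normalization chain; the constructor keywords
# below are assumptions chosen to match the attributes listed below.
from dcase_framework.learners import SceneClassifierMLP

features, annotations, test_features = load_prepared_data()  # hypothetical helper, not part of the framework

learner = SceneClassifierMLP(
    class_labels=['beach', 'bus', 'office'],   # example scene labels
    params=learner_params,                     # parameter dict as sketched earlier
    filename='scene_model.cpickle',            # target file for save() / load()
)

learner.learn(data=features, annotations=annotations)   # train the Keras model
learner.save()                                          # persist the trained model to disk

frame_probabilities = learner.predict(feature_data=test_features)  # per-frame class probabilities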

Attributes

class_labels Class labels
feature_aggregator Feature aggregator instance
feature_masker Feature masker instance
feature_normalizer Feature normalizer instance
feature_stacker Feature stacker instance
learner_params Get learner parameters from parameter container
method Learner method label
model Acoustic model
params Parameters
valid_formats