Reusable Deep Neural Networks:

Applications to Biomedical Data

Generate the script to run the Deep Transfer Learning Ensemble

Deep Transfer Ensemble

Enter the target dataset and the paths used for the experiment (example values are sketched after this list):

Path of the dataset, stored in Theano pickled format
Target dataset
Source dataset
Path of the source model (the transfer learning results are stored in the same path)
Source reuse mode, e.g. PT or PT+FT
Retrain layers during fine-tuning, e.g. [0,0,1] (enter 1 to retrain a layer and 0 to lock it)
Transfer hidden layers from the source, e.g. [1,1,1] (enter 1 to transfer a layer and 0 not to transfer it)
Path to store the results
File name of the console output
Enter the number of times to repeat the experiment
Enter the GPU number to use
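
As a concrete illustration of the inputs listed above, the sketch below assigns made-up example values to them. It is not part of online_input.py; the names that also appear in the command template further down (data_path, target_data, results_dir, result_file_name, nr_reps, gpu_nr) are kept identical, while the remaining names (source_data, source_model_path, reuse_mode, retrain_layers, transfer_layers) and all of the values are assumptions made purely for readability.

# Hypothetical example values for the transfer-learning inputs above; nothing
# here is prescribed by the project, the values only show the expected formats.
data_path = "/data/pickled/"            # dataset in Theano pickled format
target_data = "target.pkl.gz"           # target dataset
source_data = "source.pkl.gz"           # source dataset
source_model_path = "/models/source/"   # transfer learning results are stored here as well
reuse_mode = "PT+FT"                    # reuse pre-training only (PT) or pre-training plus fine-tuning (PT+FT)
retrain_layers = [0, 0, 1]              # 1 = retrain the layer during fine-tuning, 0 = lock it
transfer_layers = [1, 1, 1]             # 1 = transfer the hidden layer from the source, 0 = do not
results_dir = "/results/"               # path to store the results
result_file_name = "console.log"        # file name of the console output
nr_reps = 10                            # number of times to repeat the experiment
gpu_nr = 0                              # GPU number to use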

Enter the Stacked Denoising Autoencoder training parameters (example values are sketched after this list):

Fine-tuning learning rate
Maximum number of fine-tuning epochs
Pre-training learning rate
Maximum number of pre-training epochs
Number of neurons in each hidden layer, comma-separated, e.g. [500,500,500]
Mini batch size
Random initial seed number
Fraction of the total training data
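
The same kind of sketch for the Stacked Denoising Autoencoder training parameters: the variable names match the placeholders in the command template below, and the numbers are illustrative assumptions, not recommended settings.

# Hypothetical example values for the SdA training parameters above.
finetune_lr = 0.1                       # fine-tuning learning rate
training_epochs = 1000                  # maximum number of fine-tuning epochs
pretrain_lr = 0.001                     # pre-training learning rate
pretraining_epochs = 15                 # maximum number of pre-training epochs
hidden_layers_sizes = [500, 500, 500]   # neurons in each hidden layer
batch_size = 10                         # mini batch size
rng_seed = 89677                        # random initial seed number
training_data_fraction = 1.0            # fraction of the total training data to use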

Now copy and paste the following into a terminal and hit return to run the experiment:


taskset -c 0 nohup python online_input.py TL data_path target_data fold target_data fold target_data fold fold results_dir nr_reps gpu_nr finetune_lr training_epochs pretrain_lr pretraining_epochs hidden_layers_sizes batch_size rng_seed training_data_fraction > results_dir result_file_name 2>&1 &
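
For illustration only, the short Python sketch below fills that template with made-up values and prints the resulting command line. The positional argument order simply mirrors the template exactly as written above, every path and number is an assumption rather than a tested configuration, and joining results_dir with result_file_name as the redirect target is likewise an assumption.

# Hypothetical sketch: substitute example values into the command template and
# print the line to paste into a terminal. Values are illustrative assumptions.
template = (
    "taskset -c 0 nohup python online_input.py TL "
    "{data_path} {target_data} {fold} {target_data} {fold} {target_data} {fold} {fold} "
    "{results_dir} {nr_reps} {gpu_nr} "
    "{finetune_lr} {training_epochs} {pretrain_lr} {pretraining_epochs} "
    "{hidden_layers_sizes} {batch_size} {rng_seed} {training_data_fraction} "
    "> {results_dir}{result_file_name} 2>&1 &"
)

print(template.format(
    data_path="/data/pickled/", target_data="target.pkl.gz", fold=0,
    results_dir="/results/", result_file_name="console.log",
    nr_reps=10, gpu_nr=0,
    finetune_lr=0.1, training_epochs=1000,
    pretrain_lr=0.001, pretraining_epochs=15,
    hidden_layers_sizes="[500,500,500]", batch_size=10, rng_seed=89677,
    training_data_fraction=1.0,
))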

This project is financed by FEDER funds through the Programa Operacional Factores de Competitividade (COMPETE) and by Portuguese funds through FCT (Fundação para a Ciência e a Tecnologia) in the framework.