
The pattern looks like this:

 
from multiprocessing import Process
import os


def training_function(...):
    import keras  # import Keras here, inside the child process
    ...


if __name__ == '__main__':
    p = Process(target=training_function, args=(...,))
    p.start()
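For reference, a fully runnable variant of this skeleton could look like the following sketch; the tiny model and the random data are purely illustrative and not part of the original post:

from multiprocessing import Process


def training_function():
    # Import Keras (and with it the TensorFlow backend) only inside the
    # child process, so the parent never claims GPU/host memory.
    import numpy as np
    import keras

    # Tiny illustrative model and synthetic data.
    model = keras.models.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(8, activation='relu'),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer='adam', loss='mse')

    x = np.random.rand(64, 4)
    y = np.random.rand(64, 1)
    model.fit(x, y, epochs=2, verbose=0)


if __name__ == '__main__':
    p = Process(target=training_function)
    p.start()
    p.join()  # once the child exits, all memory it used is released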

原文地址:https://stackoverflow.com/questions/42504669/keras-tensorflow-and-multiprocessing-in-python

1. DO NOT LOAD KERAS INTO YOUR MAIN ENVIRONMENT. If you want to load Keras / Theano / TensorFlow, do it only in the function environment. E.g. don't do this:

import keras

def training_function(...):
    ...

but do the following:

def training_function(...):
    import keras
    ...

2. Run the work connected with each model in a separate process: I usually create workers that do the job (e.g. training, tuning, scoring) and run them in separate processes. What is nice about this is that the whole memory used by such a process is completely freed when the process finishes. This helps with the many memory problems you usually come across when using multiprocessing, or even when running multiple models in one process. So it looks e.g. like this:

import multiprocessing


def _training_worker(train_params):
    import keras  # imported only inside the worker process
    model = obtain_model(train_params)
    model.fit(train_params)
    send_message_to_main_process(...)


def train_new_model(train_params):
    training_process = multiprocessing.Process(
        target=_training_worker, args=(train_params,))
    training_process.start()
    get_message_from_training_process(...)
    training_process.join()
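The send_message_to_main_process / get_message_from_training_process calls above are placeholders. One possible way to realize them (an assumption, not something given in the original answer) is a multiprocessing.Queue shared between the parent and the worker:

import multiprocessing


def _training_worker(train_params, result_queue):
    import keras  # import the framework only inside the worker process
    # ... build and train the model from train_params here ...
    # Send only small, picklable results back (metrics, a saved-weights path),
    # not the model object itself.
    result_queue.put({'status': 'done'})


def train_new_model(train_params):
    result_queue = multiprocessing.Queue()
    training_process = multiprocessing.Process(
        target=_training_worker, args=(train_params, result_queue))
    training_process.start()
    message = result_queue.get()  # read the result before joining, so the
                                  # worker never blocks on a full queue
    training_process.join()       # the worker's memory is freed here
    return message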

A different approach is simply to prepare different scripts for different model actions. But this may cause memory errors, especially when your models are memory-consuming. NOTE that for this reason it is better to make your execution strictly sequential.
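One way to keep execution strictly sequential is to run each job in its own process and join it before the next one starts, so one model's memory is released before the next model is loaded. A minimal sketch, where the job list and the worker body are hypothetical:

import multiprocessing


def _run_job(action, params):
    import keras  # load the framework only inside this short-lived process
    # ... perform training / tuning / scoring depending on `action` ...


def run_jobs_sequentially(jobs):
    # jobs is e.g. [('train', {...}), ('score', {...})] -- purely illustrative
    for action, params in jobs:
        p = multiprocessing.Process(target=_run_job, args=(action, params))
        p.start()
        p.join()  # finish (and free the memory of) this job
                  # before starting the next one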

That is all of this article on running Keras/TensorFlow under multiprocessing in Python; hopefully it serves as a useful reference.
