Description
Hi. In tensorflow-wavenet, inappropriate dependency version constraints can introduce risks.
Below are the dependencies and version constraints that the project is currently using:

```
librosa>=0.5
tensorflow>=1.0.0
```
A strict constraint such as `==` risks dependency conflicts, because the allowed version range is too narrow. An unbounded constraint such as `>=` with no upper bound, or `*`, risks missing-API errors, because the latest version of a dependency may remove APIs that the project calls.
After further analysis of this project, the version constraint of the dependency librosa can be changed to `>=0.2.0,<=0.7.2`.
This modification reduces the chance of dependency conflicts as much as possible while still admitting the newest versions that do not raise errors in this project.
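For example, the suggested pin could be expressed in the project's requirements file (assuming dependencies are declared in a `requirements.txt`; the tensorflow line is left unchanged from the list above):

```
librosa>=0.2.0,<=0.7.2
tensorflow>=1.0.0
```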
The current project invokes all of the following methods.
Methods called from librosa:
librosa.output.write_wav
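This call is the reason for the proposed upper bound: `librosa.output.write_wav` was deprecated in the librosa 0.7.x series and removed in 0.8.0, so any release after 0.7.2 would break the project. As an illustration of the bound itself, here is a minimal standard-library sketch (the helper names `parse_version` and `satisfies_range` are hypothetical, not part of tensorflow-wavenet) that checks whether a given librosa version falls inside the suggested range `>=0.2.0,<=0.7.2`:

```python
def parse_version(v):
    """Turn a dotted version string like '0.7.2' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def satisfies_range(version, lower="0.2.0", upper="0.7.2"):
    """True if lower <= version <= upper (inclusive on both ends)."""
    return parse_version(lower) <= parse_version(version) <= parse_version(upper)
```

For instance, `satisfies_range("0.7.2")` is `True`, while `satisfies_range("0.8.0")` is `False`, matching the release in which `librosa.output.write_wav` was removed.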
Methods called across all dependencies:
```
sum tf.trainable_variables tf.Variable tf.nn.embedding_lookup np.logaddexp.reduce coord.join enumerate np.random.randint self._generator_conv net.predict_proba self._generator_causal_layer self.coord.should_stop tf.RunOptions os.makedirs q.enqueue_many tf.train.AdamOptimizer audio_reader.trim_silence args.optimizer.optimizer_factory tf.summary.audio q.dequeue tf.train.MomentumOptimizer argparse.ArgumentTypeError create_variable self._one_hot tf.histogram_summary get_arguments tf.div tf.zeros tf.sigmoid sys.stdout.flush initializer tf.train.Saver time_to_batch writer.add_graph librosa.load tf.summary.merge_all reader.dequeue np.nonzero WaveNetModel.calculate_receptive_field get_default_logdir load_generic_audio tf.train.get_checkpoint_state np.seterr self._create_generator tf.size open librosa.output.write_wav var.append dict audio.reshape f.write create_bias_variable np.arange self._generator_dilation_layer find_files tf.pad os.path.join optimizer.minimize np.array tf.constant trim_silence write_wav net.loss tf.RunMetadata abs tf.constant_initializer saver.restore list self._embed_gc randomize_files tf.PaddingFIFOQueue id_reg_expression.findall self._create_variables waveform.append tf.global_variables np.pad parser.parse_args self._create_causal_layer tf.cond tf.shape tf.transpose np.testing.assert_allclose float batch_to_time np.reshape sess.run tf.placeholder tf.add_n self._create_dilation_layer int len tf.nn.conv1d WaveNetModel librosa.core.frames_to_samples tf.slice ckpt.model_checkpoint_path.split thread.start create_seed threading.Thread tf.nn.l2_loss fnmatch.filter ckpt.model_checkpoint_path.split.split self.threads.append tf.get_default_graph tf.Session tf.summary.FileWriter tf.name_scope tf.nn.softmax tf.to_float tf.nn.softmax_cross_entropy_with_logits not_all_have_id tf.cast tf.add datetime.now time_since_print.total_seconds mu_law_decode os.walk np.identity tf.one_hot parser.add_argument tf.train.RMSPropOptimizer self.queue.dequeue_many json.load format
s.lower tf.ConfigProto tf.to_int32 librosa.feature.rmse create_embedding_table causal_conv tf.global_variables_initializer datetime.now.str.replace main push_ops.append save global_condition.get_shape str self.gc_queue.dequeue_many tf.train.start_queue_runners load outputs.extend q.enqueue tf.sign tf.FIFOQueue mu_law_encode net.predict_proba_incremental tf.train.Coordinator id_reg_exp.findall time.time get_category_cardinality print tl.generate_chrome_trace_format np.exp argparse.ArgumentParser writer.add_run_metadata self._create_network seed.sess.run.tolist NotImplementedError ValueError tf.reshape AudioReader range files.append os.path.exists tf.tanh np.random.choice tf.summary.scalar saver.save re.compile tf.nn.relu tf.reduce_mean random.randint tf.minimum reader.dequeue_gc timeline.Timeline np.log tf.abs init_ops.append reader.start_threads optimizer_factory.keys self.queue.enqueue tf.variable_scope self.gc_queue.enqueue validate_directories tf.matmul outputs.append tf.log1p tf.contrib.layers.xavier_initializer_conv2d writer.add_summary coord.request_stop
```
@developer
Could you please help me check this issue?
May I open a pull request to fix it?
Thank you very much.