# tacotron-2-support
u
German pretrained model
@hecko So pikachu told me to ask you about this because you're knowledgeable. I started training a single-speaker German pretrained model on IPA, and its synthesis output has just been static no matter what notebook or pretrained model I train it on. I can send an example of the static, and the dataset, if needed. I modified the cleaners and symbol set accordingly, and this did not happen with the IPA French base model I trained with the same method.
h
hmhm
i haven't touched ipa ever so if it's related to that then idk
could you send a copy of the notebook you used with all the settings in it
u
Yeah i'll do that
When i get home
I tried my notebook, as well as the notebook in #841437191073955920
Legacy btw
h
oh legacy huh
definitely haven't tried ipa on legacy
oh and also send me a few of the wavs
u
I'll send the whole thing if you want, or just 3 ish?
Also i'm going to try training with a different dataset to see if the same thing happens
h
yeah just a few, i don't wanna burn through my data cap
just wanna make sure they're formatted right because it's really easy to forget one of the settings
u
Yeah i checked but maybe i'm forgetting something weird
sorry for the delay
h
hmm all seems fine
u
yeah thats what i thought
h
then it's probably an issue with the transcripts
u
it outputs static like this
h
or the settings
u
this is the training transcript
i'll make a copy of the notebook with the settings i used
ah my message about the symbol set got deleted as spam
here we go this was under 'letters'
cleaners were set to a new definition called "return_text"
```python
def return_text(text):
    return text
```
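For context, in standard Tacotron 2-style repos a cleaner like that lives in text/cleaners.py and is looked up by name from hparams, so the name in hparams has to match the function exactly (the file layout and hparam name here are assumptions from the usual repo structure, not confirmed in this chat):

```python
# Assumed standard Tacotron 2 layout: cleaners are defined in
# text/cleaners.py and referenced by name in hparams.

def return_text(text):
    # Pass-through cleaner: IPA transcripts are already normalized,
    # so apply no lowercasing, number expansion, or ASCII folding.
    return text

# hparams would then reference it by name, e.g.:
# text_cleaners = ["return_text"]
```

If the name there doesn't match a function in cleaners.py, these repos typically raise an "unknown cleaner" error at startup, whereas a wrong-but-valid cleaner can silently mangle IPA symbols.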
h
not sure if this is the issue but it's weird that the min_learning_rate is higher than learning_rate
oh maybe
hparams.ignore_layers
not sure what to set it to since the legacy layout might be different
in pipeline it's
["embedding.weight"]
u
embedding.weight was being ignored, I think. I'll redo training and make sure of that
Wait it is?
Oh I put it in backwards i think
I think i meant for the values to be switched around
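For what it's worth, swapping learning_rate and min_learning_rate matters because these repos usually anneal with a max() clamp, so a "minimum" above the base rate wins from the very first step (a hedged sketch of the usual pattern, not this fork's actual schedule):

```python
def annealed_lr(base_lr, min_lr, decay_factor):
    # Decay the base rate but never drop below the floor. With the two
    # values swapped, the floor sits above the base rate and always wins.
    return max(base_lr * decay_factor, min_lr)

# Intended: base 1e-3 decaying toward a 1e-5 floor.
normal = annealed_lr(1e-3, 1e-5, 0.5)   # 5e-4, decay works as expected
# Swapped: the 1e-3 "floor" overrides the 1e-5 base rate every step.
swapped = annealed_lr(1e-5, 1e-3, 0.5)  # stuck at 1e-3
```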