upload hifigan
- hifigan/LICENSE +21 -0
- hifigan/LJSpeech-1.1/training.txt +0 -0
- hifigan/LJSpeech-1.1/validation.txt +150 -0
- hifigan/README.md +105 -0
- hifigan/__pycache__/env.cpython-310.pyc +0 -0
- hifigan/__pycache__/models.cpython-310.pyc +0 -0
- hifigan/__pycache__/utils.cpython-310.pyc +0 -0
- hifigan/config_v1.json +37 -0
- hifigan/env.py +15 -0
- hifigan/inference.py +95 -0
- hifigan/meldataset.py +168 -0
- hifigan/models.py +283 -0
- hifigan/train.py +271 -0
- hifigan/utils.py +58 -0
hifigan/LICENSE
ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2020 Jungil Kong

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
hifigan/LJSpeech-1.1/training.txt
ADDED
The diff for this file is too large to render.
hifigan/LJSpeech-1.1/validation.txt
ADDED
@@ -0,0 +1,150 @@
1 |
+
LJ050-0269|The essential terms of such memoranda might well be embodied in an Executive order.|The essential terms of such memoranda might well be embodied in an Executive order.
|
2 |
+
LJ050-0270|This Commission can recommend no procedures for the future protection of our Presidents which will guarantee security.|This Commission can recommend no procedures for the future protection of our Presidents which will guarantee security.
|
3 |
+
LJ050-0271|The demands on the President in the execution of His responsibilities in today's world are so varied and complex|The demands on the President in the execution of His responsibilities in today's world are so varied and complex
|
4 |
+
LJ050-0272|and the traditions of the office in a democracy such as ours are so deep-seated as to preclude absolute security.|and the traditions of the office in a democracy such as ours are so deep-seated as to preclude absolute security.
|
5 |
+
LJ050-0273|The Commission has, however, from its examination of the facts of President Kennedy's assassination|The Commission has, however, from its examination of the facts of President Kennedy's assassination
|
6 |
+
LJ050-0274|made certain recommendations which it believes would, if adopted,|made certain recommendations which it believes would, if adopted,
|
7 |
+
LJ050-0275|materially improve upon the procedures in effect at the time of President Kennedy's assassination and result in a substantial lessening of the danger.|materially improve upon the procedures in effect at the time of President Kennedy's assassination and result in a substantial lessening of the danger.
|
8 |
+
LJ050-0276|As has been pointed out, the Commission has not resolved all the proposals which could be made. The Commission nevertheless is confident that,|As has been pointed out, the Commission has not resolved all the proposals which could be made. The Commission nevertheless is confident that,
|
9 |
+
LJ050-0277|with the active cooperation of the responsible agencies and with the understanding of the people of the United States in their demands upon their President,|with the active cooperation of the responsible agencies and with the understanding of the people of the United States in their demands upon their President,
|
10 |
+
LJ050-0278|the recommendations we have here suggested would greatly advance the security of the office without any impairment of our fundamental liberties.|the recommendations we have here suggested would greatly advance the security of the office without any impairment of our fundamental liberties.
|
11 |
+
LJ001-0028|but by printers in Strasburg, Basle, Paris, Lubeck, and other cities.|but by printers in Strasburg, Basle, Paris, Lubeck, and other cities.
|
12 |
+
LJ001-0068|The characteristic Dutch type, as represented by the excellent printer Gerard Leew, is very pronounced and uncompromising Gothic.|The characteristic Dutch type, as represented by the excellent printer Gerard Leew, is very pronounced and uncompromising Gothic.
|
13 |
+
LJ002-0149|The latter indeed hung like millstones round the neck of the unhappy insolvent wretches who found themselves in limbo.|The latter indeed hung like millstones round the neck of the unhappy insolvent wretches who found themselves in limbo.
|
14 |
+
LJ002-0157|and Susannah Evans, in October the same year, for 2 shillings, with costs of 6 shillings, 8 pence.|and Susannah Evans, in October the same year, for two shillings, with costs of six shillings, eight pence.
|
15 |
+
LJ002-0167|quotes a case which came within his own knowledge of a boy sent to prison for non-payment of one penny.|quotes a case which came within his own knowledge of a boy sent to prison for non-payment of one penny.
|
16 |
+
LJ003-0042|The completion of this very necessary building was, however, much delayed for want of funds,|The completion of this very necessary building was, however, much delayed for want of funds,
|
17 |
+
LJ003-0307|but as yet no suggestion was made to provide prison uniform.|but as yet no suggestion was made to provide prison uniform.
|
18 |
+
LJ004-0169|On the dirty bedstead lay a wretched being in the throes of severe illness.|On the dirty bedstead lay a wretched being in the throes of severe illness.
|
19 |
+
LJ004-0233|Under the new rule visitors were not allowed to pass into the interior of the prison, but were detained between the grating.|Under the new rule visitors were not allowed to pass into the interior of the prison, but were detained between the grating.
|
20 |
+
LJ005-0101|whence it deduced the practice and condition of every prison that replied.|whence it deduced the practice and condition of every prison that replied.
|
21 |
+
LJ005-0108|the prisoners, without firing, bedding, or sufficient food, spent their days "in surveying their grotesque prison,|the prisoners, without firing, bedding, or sufficient food, spent their days "in surveying their grotesque prison,
|
22 |
+
LJ005-0202|An examination of this report shows how even the most insignificant township had its jail.|An examination of this report shows how even the most insignificant township had its jail.
|
23 |
+
LJ005-0234|The visits of friends was once more unreservedly allowed, and these incomers freely brought in extra provisions and beer.|The visits of friends was once more unreservedly allowed, and these incomers freely brought in extra provisions and beer.
|
24 |
+
LJ005-0248|and stated that in his opinion Newgate, as the common jail of Middlesex, was wholly inadequate to the proper confinement of its prisoners.|and stated that in his opinion Newgate, as the common jail of Middlesex, was wholly inadequate to the proper confinement of its prisoners.
|
25 |
+
LJ006-0001|The Chronicles of Newgate, Volume 2. By Arthur Griffiths. Section 9: The first report of the inspector of prisons.|The Chronicles of Newgate, Volume two. By Arthur Griffiths. Section nine: The first report of the inspector of prisons.
|
26 |
+
LJ006-0018|One was Mr. William Crawford, the other the Rev. Whitworth Russell.|One was Mr. William Crawford, the other the Rev. Whitworth Russell.
|
27 |
+
LJ006-0034|They attended early and late; they mustered the prisoners, examined into their condition,|They attended early and late; they mustered the prisoners, examined into their condition,
|
28 |
+
LJ006-0078|A new prisoner's fate, as to location, rested really with a powerful fellow-prisoner.|A new prisoner's fate, as to location, rested really with a powerful fellow-prisoner.
|
29 |
+
LJ007-0217|They go on to say|They go on to say
|
30 |
+
LJ007-0243|It was not till the erection of the new prison at Holloway in 1850, and the entire internal reconstruction of Newgate according to new ideas,|It was not till the erection of the new prison at Holloway in eighteen fifty, and the entire internal reconstruction of Newgate according to new ideas,
|
31 |
+
LJ008-0087|The change from Tyburn to the Old Bailey had worked no improvement as regards the gathering together of the crowd or its demeanor.|The change from Tyburn to the Old Bailey had worked no improvement as regards the gathering together of the crowd or its demeanor.
|
32 |
+
LJ008-0131|the other he kept between his hands.|the other he kept between his hands.
|
33 |
+
LJ008-0140|Whenever the public attention had been specially called to a particular crime, either on account of its atrocity,|Whenever the public attention had been specially called to a particular crime, either on account of its atrocity,
|
34 |
+
LJ008-0158|The pressure soon became so frightful that many would have willingly escaped from the crowd; but their attempts only increased the general confusion.|The pressure soon became so frightful that many would have willingly escaped from the crowd; but their attempts only increased the general confusion.
|
35 |
+
LJ008-0174|One cart-load of spectators having broken down, some of its occupants fell off the vehicle, and were instantly trampled to death.|One cart-load of spectators having broken down, some of its occupants fell off the vehicle, and were instantly trampled to death.
|
36 |
+
LJ010-0047|while in 1850 Her Majesty was the victim of another outrage at the hands of one Pate.|while in eighteen fifty Her Majesty was the victim of another outrage at the hands of one Pate.
|
37 |
+
LJ010-0061|That some thirty or more needy men should hope to revolutionize England is a sufficient proof of the absurdity of their attempt.|That some thirty or more needy men should hope to revolutionize England is a sufficient proof of the absurdity of their attempt.
|
38 |
+
LJ010-0105|Thistlewood was discovered next morning in a mean house in White Street, Moorfields.|Thistlewood was discovered next morning in a mean house in White Street, Moorfields.
|
39 |
+
LJ010-0233|Here again probably it was partly the love of notoriety which was the incentive,|Here again probably it was partly the love of notoriety which was the incentive,
|
40 |
+
LJ010-0234|backed possibly with the hope that, as in a much more recent case,|backed possibly with the hope that, as in a much more recent case,
|
41 |
+
LJ010-0258|As the Queen was driving from Buckingham Palace to the Chapel Royal,|As the Queen was driving from Buckingham Palace to the Chapel Royal,
|
42 |
+
LJ010-0262|charged him with the offense.|charged him with the offense.
|
43 |
+
LJ010-0270|exactly tallied with that of the deformed person "wanted" for the assault on the Queen.|exactly tallied with that of the deformed person "wanted" for the assault on the Queen.
|
44 |
+
LJ010-0293|I have already remarked that as violence was more and more eliminated from crimes against the person,|I have already remarked that as violence was more and more eliminated from crimes against the person,
|
45 |
+
LJ011-0009|Nothing more was heard of the affair, although the lady declared that she had never instructed Fauntleroy to sell.|Nothing more was heard of the affair, although the lady declared that she had never instructed Fauntleroy to sell.
|
46 |
+
LJ011-0256|By this time the neighbors were aroused, and several people came to the scene of the affray.|By this time the neighbors were aroused, and several people came to the scene of the affray.
|
47 |
+
LJ012-0044|When his trade was busiest he set up a second establishment, at the head of which, although he was married,|When his trade was busiest he set up a second establishment, at the head of which, although he was married,
|
48 |
+
LJ012-0145|Solomons was now also admitted as a witness, and his evidence, with that of Moss, secured the transportation of the principal actors in the theft.|Solomons was now also admitted as a witness, and his evidence, with that of Moss, secured the transportation of the principal actors in the theft.
|
49 |
+
LJ013-0020|he acted in a manner which excited the suspicions of the crew.|he acted in a manner which excited the suspicions of the crew.
|
50 |
+
LJ013-0077|Barber and Fletcher were both transported for life, although Fletcher declared that Barber was innocent, and had no guilty knowledge of what was being done.|Barber and Fletcher were both transported for life, although Fletcher declared that Barber was innocent, and had no guilty knowledge of what was being done.
|
51 |
+
LJ013-0228|In the pocket of the coat Mr. Cope, the governor, found a neatly-folded cloth, and asked what it was for.|In the pocket of the coat Mr. Cope, the governor, found a neatly-folded cloth, and asked what it was for.
|
52 |
+
LJ014-0020|He was soon afterwards arrested on suspicion, and a search of his lodgings brought to light several garments saturated with blood;|He was soon afterwards arrested on suspicion, and a search of his lodgings brought to light several garments saturated with blood;
|
53 |
+
LJ014-0054|a maidservant, Sarah Thomas, murdered her mistress, an aged woman, by beating out her brains with a stone.|a maidservant, Sarah Thomas, murdered her mistress, an aged woman, by beating out her brains with a stone.
|
54 |
+
LJ014-0101|he found that it was soft and new, while elsewhere it was set and hard.|he found that it was soft and new, while elsewhere it was set and hard.
|
55 |
+
LJ014-0103|beneath them was a layer of fresh mortar, beneath that a lot of loose earth, amongst which a stocking was turned up, and presently a human toe.|beneath them was a layer of fresh mortar, beneath that a lot of loose earth, amongst which a stocking was turned up, and presently a human toe.
|
56 |
+
LJ014-0263|When other pleasures palled he took a theatre, and posed as a munificent patron of the dramatic art.|When other pleasures palled he took a theatre, and posed as a munificent patron of the dramatic art.
|
57 |
+
LJ014-0272|and 1850 to embezzle and apply to his own purposes some £71,000.|and eighteen fifty to embezzle and apply to his own purposes some seventy-one thousand pounds.
|
58 |
+
LJ014-0311|His extensive business had been carried on by fraud.|His extensive business had been carried on by fraud.
|
59 |
+
LJ015-0197|which at one time spread terror throughout London. Thieves preferred now to use ingenuity rather than brute force.|which at one time spread terror throughout London. Thieves preferred now to use ingenuity rather than brute force.
|
60 |
+
LJ016-0089|He was engaged in whitewashing and cleaning; the officer who had him in charge left him on the stairs leading to the gallery.|He was engaged in whitewashing and cleaning; the officer who had him in charge left him on the stairs leading to the gallery.
|
61 |
+
LJ016-0407|who generally attended the prison services.|who generally attended the prison services.
|
62 |
+
LJ016-0443|He was promptly rescued from his perilous condition, but not before his face and hands were badly scorched.|He was promptly rescued from his perilous condition, but not before his face and hands were badly scorched.
|
63 |
+
LJ017-0033|a medical practitioner, charged with doing to death persons who relied upon his professional skill.|a medical practitioner, charged with doing to death persons who relied upon his professional skill.
|
64 |
+
LJ017-0038|That the administration of justice should never be interfered with by local prejudice or local feeling|That the administration of justice should never be interfered with by local prejudice or local feeling
|
65 |
+
LJ018-0018|he wore gold-rimmed eye-glasses and a gold watch and chain.|he wore gold-rimmed eye-glasses and a gold watch and chain.
|
66 |
+
LJ018-0119|His offer was not, however, accepted.|His offer was not, however, accepted.
|
67 |
+
LJ018-0280|The commercial experience of these clever rogues was cosmopolitan.|The commercial experience of these clever rogues was cosmopolitan.
|
68 |
+
LJ019-0178|and abandoned because of the expense. As to the entire reconstruction of Newgate, nothing had been done as yet.|and abandoned because of the expense. As to the entire reconstruction of Newgate, nothing had been done as yet.
|
69 |
+
LJ019-0240|But no structural alterations were made from the date first quoted until the time of closing the prison in 1881.|But no structural alterations were made from the date first quoted until the time of closing the prison in eighteen eighty-one.
|
70 |
+
LJ021-0049|and the curtailment of rank stock speculation through the Securities Exchange Act.|and the curtailment of rank stock speculation through the Securities Exchange Act.
|
71 |
+
LJ021-0155|both directly on the public works themselves, and indirectly in the industries supplying the materials for these public works.|both directly on the public works themselves, and indirectly in the industries supplying the materials for these public works.
|
72 |
+
LJ022-0046|It is true that while business and industry are definitely better our relief rolls are still too large.|It is true that while business and industry are definitely better our relief rolls are still too large.
|
73 |
+
LJ022-0173|for the regulation of transportation by water, for the strengthening of our Merchant Marine and Air Transport,|for the regulation of transportation by water, for the strengthening of our Merchant Marine and Air Transport,
|
74 |
+
LJ024-0087|I have thus explained to you the reasons that lie behind our efforts to secure results by legislation within the Constitution.|I have thus explained to you the reasons that lie behind our efforts to secure results by legislation within the Constitution.
|
75 |
+
LJ024-0110|And the strategy of that last stand is to suggest the time-consuming process of amendment in order to kill off by delay|And the strategy of that last stand is to suggest the time-consuming process of amendment in order to kill off by delay
|
76 |
+
LJ024-0119|When before have you found them really at your side in your fights for progress?|When before have you found them really at your side in your fights for progress?
|
77 |
+
LJ025-0091|as it was current among contemporary chemists.|as it was current among contemporary chemists.
|
78 |
+
LJ026-0029|so in the case under discussion.|so in the case under discussion.
|
79 |
+
LJ026-0039|the earliest organisms were protists and that from them animals and plants were evolved along divergent lines of descent.|the earliest organisms were protists and that from them animals and plants were evolved along divergent lines of descent.
|
80 |
+
LJ026-0064|but unlike that of the animal, it is not chiefly an income of foods, but only of the raw materials of food.|but unlike that of the animal, it is not chiefly an income of foods, but only of the raw materials of food.
|
81 |
+
LJ026-0105|This is done by diastase, an enzyme of plant cells.|This is done by diastase, an enzyme of plant cells.
|
82 |
+
LJ026-0137|and be laid down as "reserve starch" in the cells of root or stem or elsewhere.|and be laid down as "reserve starch" in the cells of root or stem or elsewhere.
|
83 |
+
LJ027-0006|In all these lines the facts are drawn together by a strong thread of unity.|In all these lines the facts are drawn together by a strong thread of unity.
|
84 |
+
LJ028-0134|He also erected what is called a pensile paradise:|He also erected what is called a pensile paradise:
|
85 |
+
LJ028-0138|perhaps the tales that travelers told him were exaggerated as travelers' tales are likely to be,|perhaps the tales that travelers told him were exaggerated as travelers' tales are likely to be,
|
86 |
+
LJ028-0189|The fall of Babylon with its lofty walls was a most important event in the history of the ancient world.|The fall of Babylon with its lofty walls was a most important event in the history of the ancient world.
|
87 |
+
LJ028-0281|Till mules foal ye shall not take our city, he thought, as he reflected on this speech, that Babylon might now be taken,|Till mules foal ye shall not take our city, he thought, as he reflected on this speech, that Babylon might now be taken,
|
88 |
+
LJ029-0188|Stevenson was jeered, jostled, and spat upon by hostile demonstrators outside the Dallas Memorial Auditorium Theater.|Stevenson was jeered, jostled, and spat upon by hostile demonstrators outside the Dallas Memorial Auditorium Theater.
|
89 |
+
LJ030-0098|The remainder of the motorcade consisted of five cars for other dignitaries, including the mayor of Dallas and Texas Congressmen,|The remainder of the motorcade consisted of five cars for other dignitaries, including the mayor of Dallas and Texas Congressmen,
|
90 |
+
LJ031-0007|Chief of Police Curry and police motorcyclists at the head of the motorcade led the way to the hospital.|Chief of Police Curry and police motorcyclists at the head of the motorcade led the way to the hospital.
|
91 |
+
LJ031-0091|You have to determine which things, which are immediately life threatening and cope with them, before attempting to evaluate the full extent of the injuries.|You have to determine which things, which are immediately life threatening and cope with them, before attempting to evaluate the full extent of the injuries.
|
92 |
+
LJ031-0227|The doctors traced the course of the bullet through the body and, as information was received from Parkland Hospital,|The doctors traced the course of the bullet through the body and, as information was received from Parkland Hospital,
|
93 |
+
LJ032-0100|Marina Oswald|Marina Oswald
|
94 |
+
LJ032-0165|to the exclusion of all others because there are not enough microscopic characteristics present in fibers.|to the exclusion of all others because there are not enough microscopic characteristics present in fibers.
|
95 |
+
LJ032-0198|During the period from March 2, 1963, to April 24, 1963,|During the period from March two, nineteen sixty-three, to April twenty-four, nineteen sixty-three,
|
96 |
+
LJ033-0046|went out to the garage to paint some children's blocks, and worked in the garage for half an hour or so.|went out to the garage to paint some children's blocks, and worked in the garage for half an hour or so.
|
97 |
+
LJ033-0072|I then stepped off of it and the officer picked it up in the middle and it bent so.|I then stepped off of it and the officer picked it up in the middle and it bent so.
|
98 |
+
LJ033-0135|Location of Bag|Location of Bag
|
99 |
+
LJ034-0083|The significance of Givens' observation that Oswald was carrying his clipboard|The significance of Givens' observation that Oswald was carrying his clipboard
|
100 |
+
LJ034-0179|and, quote, seemed to be sitting a little forward, end quote,|and, quote, seemed to be sitting a little forward, end quote,
|
101 |
+
LJ035-0125|Victoria Adams, who worked on the fourth floor of the Depository Building,|Victoria Adams, who worked on the fourth floor of the Depository Building,
|
102 |
+
LJ035-0162|approximately 30 to 45 seconds after Oswald's lunchroom encounter with Baker and Truly.|approximately thirty to forty-five seconds after Oswald's lunchroom encounter with Baker and Truly.
|
103 |
+
LJ035-0189|Special Agent Forrest V. Sorrels of the Secret Service, who had been in the motorcade,|Special Agent Forrest V. Sorrels of the Secret Service, who had been in the motorcade,
|
104 |
+
LJ035-0208|Oswald's known actions in the building immediately after the assassination are consistent with his having been at the southeast corner window of the sixth floor|Oswald's known actions in the building immediately after the assassination are consistent with his having been at the southeast corner window of the sixth floor
|
105 |
+
LJ036-0216|Tippit got out and started to walk around the front of the car|Tippit got out and started to walk around the front of the car
|
106 |
+
LJ037-0093|William Arthur Smith was about a block east of 10th and Patton when he heard shots.|William Arthur Smith was about a block east of tenth and Patton when he heard shots.
|
107 |
+
LJ037-0157|taken from Oswald.|taken from Oswald.
|
108 |
+
LJ037-0178|or one used Remington-Peters cartridge case, which may have been in the revolver before the shooting,|or one used Remington-Peters cartridge case, which may have been in the revolver before the shooting,
|
109 |
+
LJ037-0219|Oswald's Jacket|Oswald's Jacket
|
110 |
+
LJ037-0222|When Oswald was arrested, he did not have a jacket.|When Oswald was arrested, he did not have a jacket.
|
111 |
+
LJ038-0017|Attracted by the sound of the sirens, Mrs. Postal stepped out of the box office and walked to the curb.|Attracted by the sound of the sirens, Mrs. Postal stepped out of the box office and walked to the curb.
|
112 |
+
LJ038-0052|testified regarding the arrest of Oswald, as did the various police officers who participated in the fight.|testified regarding the arrest of Oswald, as did the various police officers who participated in the fight.
|
113 |
+
LJ038-0077|Statements of Oswald during Detention.|Statements of Oswald during Detention.
|
114 |
+
LJ038-0161|and he asked me did I know which way he was coming, and I told him, yes, he probably come down Main and turn on Houston and then back again on Elm.|and he asked me did I know which way he was coming, and I told him, yes, he probably come down Main and turn on Houston and then back again on Elm.
|
115 |
+
LJ038-0212|which appeared to be the work of a man expecting to be killed, or imprisoned, or to disappear.|which appeared to be the work of a man expecting to be killed, or imprisoned, or to disappear.
|
116 |
+
LJ039-0103|Oswald, like all Marine recruits, received training on the rifle range at distances up to 500 yards,|Oswald, like all Marine recruits, received training on the rifle range at distances up to five hundred yards,
|
117 |
+
LJ039-0149|established that they had been previously loaded and ejected from the assassination rifle,|established that they had been previously loaded and ejected from the assassination rifle,
|
118 |
+
LJ040-0107|but apparently was not able to spend as much time with them as he would have liked, because of the age gaps of 5 and 7 years,|but apparently was not able to spend as much time with them as he would have liked, because of the age gaps of five and seven years,
|
119 |
+
LJ040-0119|When Pic returned home, Mrs. Oswald tried to play down the event but Mrs. Pic took a different view and asked the Oswalds to leave.|When Pic returned home, Mrs. Oswald tried to play down the event but Mrs. Pic took a different view and asked the Oswalds to leave.
|
120 |
+
LJ040-0161|Dr. Hartogs recommended that Oswald be placed on probation on condition that he seek help and guidance through a child guidance clinic.|Dr. Hartogs recommended that Oswald be placed on probation on condition that he seek help and guidance through a child guidance clinic.
|
121 |
+
LJ040-0169|She observed that since Lee's mother worked all day, he made his own meals and spent all his time alone|She observed that since Lee's mother worked all day, he made his own meals and spent all his time alone
|
122 |
+
LJ041-0098|All the Marine Corps did was to teach you to kill and after you got out of the Marines you might be good gangsters, end quote.|All the Marine Corps did was to teach you to kill and after you got out of the Marines you might be good gangsters, end quote.
|
123 |
+
LJ042-0017|and see for himself how a revolutionary society operates, a Marxist society.|and see for himself how a revolutionary society operates, a Marxist society.
|
124 |
+
LJ042-0070|Oswald was discovered in time to thwart his attempt at suicide.|Oswald was discovered in time to thwart his attempt at suicide.
|
125 |
+
LJ042-0161|Immediately after serving out his 3 years in the U.S. Marine Corps, he abandoned his American life to seek a new life in the USSR.|Immediately after serving out his three years in the U.S. Marine Corps, he abandoned his American life to seek a new life in the USSR.
|
126 |
+
LJ043-0147|He had left a note for his wife telling her what to do in case he were apprehended, as well as his notebook and the pictures of himself holding the rifle.|He had left a note for his wife telling her what to do in case he were apprehended, as well as his notebook and the pictures of himself holding the rifle.
|
127 |
+
LJ043-0178|as, in fact, one of them did appear after the assassination.|as, in fact, one of them did appear after the assassination.
|
128 |
+
LJ043-0183|Oswald did not lack the determination and other traits required|Oswald did not lack the determination and other traits required
|
129 |
+
LJ043-0185|Some idea of what he thought was sufficient reason for such an act may be found in the nature of the motive that he stated for his attack on General Walker.|Some idea of what he thought was sufficient reason for such an act may be found in the nature of the motive that he stated for his attack on General Walker.
|
130 |
+
LJ044-0057|extensive investigation was not able to connect Oswald with that address, although it did develop the fact|extensive investigation was not able to connect Oswald with that address, although it did develop the fact
|
131 |
+
LJ044-0109|It is good to know that movements in support of fair play for Cuba has developed in New Orleans as well as in other cities.|It is good to know that movements in support of fair play for Cuba has developed in New Orleans as well as in other cities.
|
132 |
+
LJ045-0081|Although she denied it in some of her testimony before the Commission,|Although she denied it in some of her testimony before the Commission,
|
133 |
+
LJ045-0147|She asked Oswald, quote,|She asked Oswald, quote,
|
134 |
+
LJ045-0204|he had never found anything to which he felt he could really belong.|he had never found anything to which he felt he could really belong.
|
135 |
+
LJ046-0193|and 12 to 15 of these cases as highly dangerous risks.|and twelve to fifteen of these cases as highly dangerous risks.
|
136 |
+
LJ046-0244|PRS should have investigated and been prepared to guard against it.|PRS should have investigated and been prepared to guard against it.
|
137 |
+
LJ047-0059|However, pursuant to a regular Bureau practice of interviewing certain immigrants from Iron Curtain countries,|However, pursuant to a regular Bureau practice of interviewing certain immigrants from Iron Curtain countries,
|
138 |
+
LJ047-0142|The Bureau had no earlier information suggesting that Oswald had left the United States.|The Bureau had no earlier information suggesting that Oswald had left the United States.
|
139 |
+
LJ048-0035|It was against this background and consistent with the criteria followed by the FBI prior to November 22|It was against this background and consistent with the criteria followed by the FBI prior to November twenty-two
|
140 |
+
LJ048-0063|The formal FBI instructions to its agents outlining the information to be referred to the Secret Service were too narrow at the time of the assassination.|The formal FBI instructions to its agents outlining the information to be referred to the Secret Service were too narrow at the time of the assassination.
|
141 |
+
LJ048-0104|There were far safer routes via freeways directly to the Trade Mart,|There were far safer routes via freeways directly to the Trade Mart,
|
142 |
+
LJ048-0187|In addition, Secret Service agents riding in the motorcade were trained to scan buildings as part of their general observation of the crowd of spectators.|In addition, Secret Service agents riding in the motorcade were trained to scan buildings as part of their general observation of the crowd of spectators.
|
143 |
+
LJ048-0271|will be cause for removal from the Service, end quote.|will be cause for removal from the Service, end quote.
|
144 |
+
LJ049-0031|The Presidential vehicle in use in Dallas, described in chapter 2,|The Presidential vehicle in use in Dallas, described in chapter two,
|
145 |
+
LJ049-0059|Agents are instructed that it is not their responsibility to investigate or evaluate a present danger,|Agents are instructed that it is not their responsibility to investigate or evaluate a present danger,
|
146 |
+
LJ049-0174|to notify the Secret Service of the substantial information about Lee Harvey Oswald which the FBI had accumulated|to notify the Secret Service of the substantial information about Lee Harvey Oswald which the FBI had accumulated
|
147 |
+
LJ050-0049|and from a specialist in psychiatric prognostication at Walter Reed Hospital.|and from a specialist in psychiatric prognostication at Walter Reed Hospital.
|
148 |
+
LJ050-0113|Such agreements should describe in detail the information which is sought, the manner in which it will be provided to the Secret Service,|Such agreements should describe in detail the information which is sought, the manner in which it will be provided to the Secret Service,
|
149 |
+
LJ050-0150|Its present manual filing system is obsolete;|Its present manual filing system is obsolete;
|
150 |
+
LJ050-0189|that written instructions might come into the hands of local newspapers, to the prejudice of the precautions described.|that written instructions might come into the hands of local newspapers, to the prejudice of the precautions described.
|
hifigan/README.md
ADDED
@@ -0,0 +1,105 @@
# HiFi-GAN: Generative Adversarial Networks for Efficient and High Fidelity Speech Synthesis

### Jungil Kong, Jaehyeon Kim, Jaekyoung Bae

In our [paper](https://arxiv.org/abs/2010.05646),
we proposed HiFi-GAN: a GAN-based model capable of generating high fidelity speech efficiently.<br/>
We provide our implementation and pretrained models as open source in this repository.

**Abstract:**
Several recent works on speech synthesis have employed generative adversarial networks (GANs) to produce raw waveforms.
Although such methods improve the sampling efficiency and memory usage,
their sample quality has not yet reached that of autoregressive and flow-based generative models.
In this work, we propose HiFi-GAN, which achieves both efficient and high-fidelity speech synthesis.
As speech audio consists of sinusoidal signals with various periods,
we demonstrate that modeling periodic patterns of audio is crucial for enhancing sample quality.
A subjective human evaluation (mean opinion score, MOS) on a single-speaker dataset indicates that our proposed method
demonstrates similarity to human quality while generating 22.05 kHz high-fidelity audio 167.9 times faster than
real-time on a single V100 GPU. We further show the generality of HiFi-GAN to the mel-spectrogram inversion of unseen
speakers and end-to-end speech synthesis. Finally, a small footprint version of HiFi-GAN generates samples 13.4 times
faster than real-time on CPU with comparable quality to an autoregressive counterpart.

Visit our [demo website](https://jik876.github.io/hifi-gan-demo/) for audio samples.


## Pre-requisites
1. Python >= 3.6
2. Clone this repository.
3. Install python requirements. Please refer to [requirements.txt](requirements.txt).
4. Download and extract the [LJ Speech dataset](https://keithito.com/LJ-Speech-Dataset/),
and move all wav files to `LJSpeech-1.1/wavs`.


## Training
```
python train.py --config config_v1.json
```
To train the V2 or V3 generator, replace `config_v1.json` with `config_v2.json` or `config_v3.json`.<br>
Checkpoints and a copy of the configuration file are saved in the `cp_hifigan` directory by default.<br>
You can change the path by adding the `--checkpoint_path` option.

Validation loss during training with the V1 generator.<br>
![validation loss](./validation_loss.png)

## Pretrained Model
You can also use the pretrained models we provide.<br/>
[Download pretrained models](https://drive.google.com/drive/folders/1-eEYTB5Av9jNql0WGBlRoi-WH2J7bp5Y?usp=sharing)<br/>
Details of each folder are as follows:

|Folder Name|Generator|Dataset|Fine-Tuned|
|------|---|---|---|
|LJ_V1|V1|LJSpeech|No|
|LJ_V2|V2|LJSpeech|No|
|LJ_V3|V3|LJSpeech|No|
|LJ_FT_T2_V1|V1|LJSpeech|Yes ([Tacotron2](https://github.com/NVIDIA/tacotron2))|
|LJ_FT_T2_V2|V2|LJSpeech|Yes ([Tacotron2](https://github.com/NVIDIA/tacotron2))|
|LJ_FT_T2_V3|V3|LJSpeech|Yes ([Tacotron2](https://github.com/NVIDIA/tacotron2))|
|VCTK_V1|V1|VCTK|No|
|VCTK_V2|V2|VCTK|No|
|VCTK_V3|V3|VCTK|No|
|UNIVERSAL_V1|V1|Universal|No|

We provide the universal model with discriminator weights that can be used as a base for transfer learning to other datasets.
## Fine-Tuning
1. Generate mel-spectrograms in numpy format using [Tacotron2](https://github.com/NVIDIA/tacotron2) with teacher forcing.<br/>
The file name of each generated mel-spectrogram should match its audio file, with a `.npy` extension.<br/>
Example:
```
Audio File : LJ001-0001.wav
Mel-Spectrogram File : LJ001-0001.npy
```
2. Create an `ft_dataset` folder and copy the generated mel-spectrogram files into it.<br/>
3. Run the following command.
```
python train.py --fine_tuning True --config config_v1.json
```
For other command line options, please refer to the training section.
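Not part of the original README, but a small sketch of what step 1 of fine-tuning produces: one `.npy` per training wav, saved under the same basename so that `meldataset.py` can pair them up. The mel shape `[80, frames]` and the file names below are assumptions based on `config_v1.json` and the LJSpeech layout.

```python
import os
import numpy as np
import torch

# Stand-in for the teacher-forced mel that Tacotron2 would produce for LJ001-0001.wav;
# with config_v1.json there are 80 mel bins, and 'frames' depends on utterance length.
mel = torch.randn(80, 812)

os.makedirs('ft_dataset', exist_ok=True)
# The basename must match the wav file; only the extension changes to .npy.
np.save(os.path.join('ft_dataset', 'LJ001-0001.npy'), mel.numpy())
```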
## Inference from wav file
1. Make a `test_files` directory and copy wav files into the directory.
2. Run the following command.
```
python inference.py --checkpoint_file [generator checkpoint file path]
```
Generated wav files are saved in `generated_files` by default.<br>
You can change the path by adding the `--output_dir` option.


## Inference for end-to-end speech synthesis
1. Make a `test_mel_files` directory and copy generated mel-spectrogram files into the directory.<br>
You can generate mel-spectrograms using [Tacotron2](https://github.com/NVIDIA/tacotron2),
[Glow-TTS](https://github.com/jaywalnut310/glow-tts) and so forth.
2. Run the following command.
```
python inference_e2e.py --checkpoint_file [generator checkpoint file path]
```
Generated wav files are saved in `generated_files_from_mel` by default.<br>
You can change the path by adding the `--output_dir` option.


## Acknowledgements
We referred to [WaveGlow](https://github.com/NVIDIA/waveglow), [MelGAN](https://github.com/descriptinc/melgan-neurips)
and [Tacotron2](https://github.com/NVIDIA/tacotron2) to implement this.
hifigan/__pycache__/env.cpython-310.pyc
ADDED
Binary file (840 Bytes).
hifigan/__pycache__/models.cpython-310.pyc
ADDED
Binary file (8.7 kB).
hifigan/__pycache__/utils.cpython-310.pyc
ADDED
Binary file (2.05 kB).
hifigan/config_v1.json
ADDED
@@ -0,0 +1,37 @@
{
    "resblock": "1",
    "num_gpus": 0,
    "batch_size": 16,
    "learning_rate": 0.0002,
    "adam_b1": 0.8,
    "adam_b2": 0.99,
    "lr_decay": 0.999,
    "seed": 1234,

    "upsample_rates": [8,8,2,2],
    "upsample_kernel_sizes": [16,16,4,4],
    "upsample_initial_channel": 512,
    "resblock_kernel_sizes": [3,7,11],
    "resblock_dilation_sizes": [[1,3,5], [1,3,5], [1,3,5]],

    "segment_size": 8192,
    "num_mels": 80,
    "num_freq": 1025,
    "n_fft": 1024,
    "hop_size": 256,
    "win_size": 1024,

    "sampling_rate": 22050,

    "fmin": 0,
    "fmax": 8000,
    "fmax_for_loss": null,

    "num_workers": 4,

    "dist_config": {
        "dist_backend": "nccl",
        "dist_url": "tcp://localhost:54321",
        "world_size": 1
    }
}
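One relationship in this config worth calling out: the product of `upsample_rates` has to equal `hop_size`, because the generator expands each mel frame back into exactly one hop of audio. A quick sanity check, as a sketch that assumes it is run from the repository root so the path below resolves:

```python
import json

with open('hifigan/config_v1.json') as f:
    h = json.load(f)

total_upsampling = 1
for r in h['upsample_rates']:
    total_upsampling *= r  # 8 * 8 * 2 * 2 = 256

# One mel frame is expanded to hop_size samples of 22050 Hz audio.
assert total_upsampling == h['hop_size'] == 256
```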
hifigan/env.py
ADDED
@@ -0,0 +1,15 @@
import os
import shutil


class AttrDict(dict):
    def __init__(self, *args, **kwargs):
        super(AttrDict, self).__init__(*args, **kwargs)
        self.__dict__ = self


def build_env(config, config_name, path):
    t_path = os.path.join(path, config_name)
    if config != t_path:
        os.makedirs(path, exist_ok=True)
        shutil.copyfile(config, os.path.join(path, config_name))
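A short usage sketch for `env.AttrDict` (assuming it is run from inside the `hifigan` directory so the import resolves the same way it does in `inference.py`): the scripts load the JSON config once and then use attribute access everywhere else.

```python
import json
from env import AttrDict

with open('config_v1.json') as f:
    h = AttrDict(json.load(f))

# Attribute access and key access refer to the same underlying dict.
print(h.sampling_rate, h['hop_size'])  # 22050 256
```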
hifigan/inference.py
ADDED
@@ -0,0 +1,95 @@
from __future__ import absolute_import, division, print_function, unicode_literals

import glob
import os
import argparse
import json
import torch
from scipy.io.wavfile import write
from env import AttrDict
from meldataset import mel_spectrogram, MAX_WAV_VALUE, load_wav
from models import Generator

h = None
device = None


def load_checkpoint(filepath, device):
    assert os.path.isfile(filepath)
    print("Loading '{}'".format(filepath))
    checkpoint_dict = torch.load(filepath, map_location=device)
    print("Complete.")
    return checkpoint_dict


def get_mel(x):
    return mel_spectrogram(x, h.n_fft, h.num_mels, h.sampling_rate, h.hop_size, h.win_size, h.fmin, h.fmax)


def scan_checkpoint(cp_dir, prefix):
    pattern = os.path.join(cp_dir, prefix + '*')
    cp_list = glob.glob(pattern)
    if len(cp_list) == 0:
        return ''
    return sorted(cp_list)[-1]


def inference(a):
    generator = Generator(h).to(device)

    state_dict_g = load_checkpoint(a.checkpoint_file, device)
    generator.load_state_dict(state_dict_g['generator'])

    filelist = os.listdir(a.input_wavs_dir)

    os.makedirs(a.output_dir, exist_ok=True)

    generator.eval()
    generator.remove_weight_norm()
    with torch.no_grad():
        for i, filname in enumerate(filelist):
            wav, sr = load_wav(os.path.join(a.input_wavs_dir, filname))
            wav = wav / MAX_WAV_VALUE
            wav = torch.FloatTensor(wav).to(device)
            x = get_mel(wav.unsqueeze(0))
            y_g_hat = generator(x)
            audio = y_g_hat.squeeze()
            audio = audio * MAX_WAV_VALUE
            audio = audio.cpu().numpy().astype('int16')

            output_file = os.path.join(a.output_dir, os.path.splitext(filname)[0] + '_generated.wav')
            write(output_file, h.sampling_rate, audio)
            print(output_file)


def main():
    print('Initializing Inference Process..')

    parser = argparse.ArgumentParser()
    parser.add_argument('--input_wavs_dir', default='test_files')
    parser.add_argument('--output_dir', default='generated_files')
    parser.add_argument('--checkpoint_file', required=True)
    a = parser.parse_args()

    config_file = os.path.join(os.path.split(a.checkpoint_file)[0], 'config.json')
    with open(config_file) as f:
        data = f.read()

    global h
    json_config = json.loads(data)
    h = AttrDict(json_config)

    torch.manual_seed(h.seed)
    global device
    if torch.cuda.is_available():
        torch.cuda.manual_seed(h.seed)
        device = torch.device('cuda')
    else:
        device = torch.device('cpu')

    inference(a)


if __name__ == '__main__':
    main()
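Two details of `inference.py` are easy to miss: it looks for a `config.json` in the same directory as the checkpoint, and the checkpoint must be a dict with a `'generator'` key. Below is a minimal sketch of driving the generator directly, mirroring the imports used in `inference.py`; paths are placeholders, and note that `models.py` itself imports its utils via `tacotron_gst.hifigan`, so the surrounding package layout of the Space matters.

```python
import json
import torch
from env import AttrDict
from meldataset import load_wav, mel_spectrogram, MAX_WAV_VALUE
from models import Generator

# Placeholder paths: a generator checkpoint and the config.json stored next to it.
with open('cp_hifigan/config.json') as f:
    h = AttrDict(json.load(f))
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

generator = Generator(h).to(device)
state = torch.load('cp_hifigan/g_02500000', map_location=device)
generator.load_state_dict(state['generator'])  # checkpoint dict must carry a 'generator' key
generator.eval()
generator.remove_weight_norm()

wav, sr = load_wav('test_files/LJ001-0001.wav')  # placeholder 22050 Hz input
wav = torch.FloatTensor(wav / MAX_WAV_VALUE).unsqueeze(0).to(device)
mel = mel_spectrogram(wav, h.n_fft, h.num_mels, h.sampling_rate,
                      h.hop_size, h.win_size, h.fmin, h.fmax)
with torch.no_grad():
    audio = generator(mel)  # shape [1, 1, frames * hop_size]
```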
hifigan/meldataset.py
ADDED
@@ -0,0 +1,168 @@
import math
import os
import random
import torch
import torch.utils.data
import numpy as np
from librosa.util import normalize
from scipy.io.wavfile import read
from librosa.filters import mel as librosa_mel_fn

MAX_WAV_VALUE = 32768.0


def load_wav(full_path):
    sampling_rate, data = read(full_path)
    return data, sampling_rate


def dynamic_range_compression(x, C=1, clip_val=1e-5):
    return np.log(np.clip(x, a_min=clip_val, a_max=None) * C)


def dynamic_range_decompression(x, C=1):
    return np.exp(x) / C


def dynamic_range_compression_torch(x, C=1, clip_val=1e-5):
    return torch.log(torch.clamp(x, min=clip_val) * C)


def dynamic_range_decompression_torch(x, C=1):
    return torch.exp(x) / C


def spectral_normalize_torch(magnitudes):
    output = dynamic_range_compression_torch(magnitudes)
    return output


def spectral_de_normalize_torch(magnitudes):
    output = dynamic_range_decompression_torch(magnitudes)
    return output


mel_basis = {}
hann_window = {}


def mel_spectrogram(y, n_fft, num_mels, sampling_rate, hop_size, win_size, fmin, fmax, center=False):
    if torch.min(y) < -1.:
        print('min value is ', torch.min(y))
    if torch.max(y) > 1.:
        print('max value is ', torch.max(y))

    global mel_basis, hann_window
    if fmax not in mel_basis:
        mel = librosa_mel_fn(sampling_rate, n_fft, num_mels, fmin, fmax)
        mel_basis[str(fmax)+'_'+str(y.device)] = torch.from_numpy(mel).float().to(y.device)
        hann_window[str(y.device)] = torch.hann_window(win_size).to(y.device)

    y = torch.nn.functional.pad(y.unsqueeze(1), (int((n_fft-hop_size)/2), int((n_fft-hop_size)/2)), mode='reflect')
    y = y.squeeze(1)

    spec = torch.stft(y, n_fft, hop_length=hop_size, win_length=win_size, window=hann_window[str(y.device)],
                      center=center, pad_mode='reflect', normalized=False, onesided=True)

    spec = torch.sqrt(spec.pow(2).sum(-1)+(1e-9))

    spec = torch.matmul(mel_basis[str(fmax)+'_'+str(y.device)], spec)
    spec = spectral_normalize_torch(spec)

    return spec


def get_dataset_filelist(a):
    with open(a.input_training_file, 'r', encoding='utf-8') as fi:
        training_files = [os.path.join(a.input_wavs_dir, x.split('|')[0] + '.wav')
                          for x in fi.read().split('\n') if len(x) > 0]

    with open(a.input_validation_file, 'r', encoding='utf-8') as fi:
        validation_files = [os.path.join(a.input_wavs_dir, x.split('|')[0] + '.wav')
                            for x in fi.read().split('\n') if len(x) > 0]
    return training_files, validation_files


class MelDataset(torch.utils.data.Dataset):
    def __init__(self, training_files, segment_size, n_fft, num_mels,
                 hop_size, win_size, sampling_rate, fmin, fmax, split=True, shuffle=True, n_cache_reuse=1,
                 device=None, fmax_loss=None, fine_tuning=False, base_mels_path=None):
        self.audio_files = training_files
        random.seed(1234)
        if shuffle:
            random.shuffle(self.audio_files)
        self.segment_size = segment_size
        self.sampling_rate = sampling_rate
        self.split = split
        self.n_fft = n_fft
        self.num_mels = num_mels
        self.hop_size = hop_size
        self.win_size = win_size
        self.fmin = fmin
        self.fmax = fmax
        self.fmax_loss = fmax_loss
        self.cached_wav = None
        self.n_cache_reuse = n_cache_reuse
        self._cache_ref_count = 0
        self.device = device
        self.fine_tuning = fine_tuning
        self.base_mels_path = base_mels_path

    def __getitem__(self, index):
        filename = self.audio_files[index]
        if self._cache_ref_count == 0:
            audio, sampling_rate = load_wav(filename)
            audio = audio / MAX_WAV_VALUE
            if not self.fine_tuning:
                audio = normalize(audio) * 0.95
            self.cached_wav = audio
            if sampling_rate != self.sampling_rate:
                raise ValueError("{} SR doesn't match target {} SR".format(
                    sampling_rate, self.sampling_rate))
            self._cache_ref_count = self.n_cache_reuse
        else:
            audio = self.cached_wav
            self._cache_ref_count -= 1

        audio = torch.FloatTensor(audio)
        audio = audio.unsqueeze(0)

        if not self.fine_tuning:
            if self.split:
                if audio.size(1) >= self.segment_size:
                    max_audio_start = audio.size(1) - self.segment_size
                    audio_start = random.randint(0, max_audio_start)
                    audio = audio[:, audio_start:audio_start+self.segment_size]
                else:
                    audio = torch.nn.functional.pad(audio, (0, self.segment_size - audio.size(1)), 'constant')

            mel = mel_spectrogram(audio, self.n_fft, self.num_mels,
                                  self.sampling_rate, self.hop_size, self.win_size, self.fmin, self.fmax,
                                  center=False)
        else:
            mel = np.load(
                os.path.join(self.base_mels_path, os.path.splitext(os.path.split(filename)[-1])[0] + '.npy'))
            mel = torch.from_numpy(mel)

            if len(mel.shape) < 3:
                mel = mel.unsqueeze(0)

            if self.split:
                frames_per_seg = math.ceil(self.segment_size / self.hop_size)

                if audio.size(1) >= self.segment_size:
                    mel_start = random.randint(0, mel.size(2) - frames_per_seg - 1)
                    mel = mel[:, :, mel_start:mel_start + frames_per_seg]
                    audio = audio[:, mel_start * self.hop_size:(mel_start + frames_per_seg) * self.hop_size]
                else:
                    mel = torch.nn.functional.pad(mel, (0, frames_per_seg - mel.size(2)), 'constant')
                    audio = torch.nn.functional.pad(audio, (0, self.segment_size - audio.size(1)), 'constant')

        mel_loss = mel_spectrogram(audio, self.n_fft, self.num_mels,
                                   self.sampling_rate, self.hop_size, self.win_size, self.fmin, self.fmax_loss,
                                   center=False)

        return (mel.squeeze(), audio.squeeze(0), filename, mel_loss.squeeze())

    def __len__(self):
        return len(self.audio_files)
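For orientation (a sketch, not part of the upload): with the `config_v1.json` values, each `MelDataset` item is a `(mel, audio, filename, mel_loss)` tuple cropped to `segment_size` samples. The wav path below is a placeholder for entries produced by `get_dataset_filelist`.

```python
from meldataset import MelDataset

files = ['LJSpeech-1.1/wavs/LJ001-0001.wav']  # placeholder file list

dataset = MelDataset(files, segment_size=8192, n_fft=1024, num_mels=80,
                     hop_size=256, win_size=1024, sampling_rate=22050,
                     fmin=0, fmax=8000, fmax_loss=None)
mel, audio, filename, mel_loss = dataset[0]
print(mel.shape, audio.shape)  # torch.Size([80, 32]) torch.Size([8192]); 8192 / 256 = 32 frames
```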
hifigan/models.py
ADDED
@@ -0,0 +1,283 @@
import torch
import torch.nn.functional as F
import torch.nn as nn
from torch.nn import Conv1d, ConvTranspose1d, AvgPool1d, Conv2d
from torch.nn.utils import weight_norm, remove_weight_norm, spectral_norm
from tacotron_gst.hifigan.utils import init_weights, get_padding

LRELU_SLOPE = 0.1


class ResBlock1(torch.nn.Module):
    def __init__(self, h, channels, kernel_size=3, dilation=(1, 3, 5)):
        super(ResBlock1, self).__init__()
        self.h = h
        self.convs1 = nn.ModuleList([
            weight_norm(Conv1d(channels, channels, kernel_size, 1, dilation=dilation[0],
                               padding=get_padding(kernel_size, dilation[0]))),
            weight_norm(Conv1d(channels, channels, kernel_size, 1, dilation=dilation[1],
                               padding=get_padding(kernel_size, dilation[1]))),
            weight_norm(Conv1d(channels, channels, kernel_size, 1, dilation=dilation[2],
                               padding=get_padding(kernel_size, dilation[2])))
        ])
        self.convs1.apply(init_weights)

        self.convs2 = nn.ModuleList([
            weight_norm(Conv1d(channels, channels, kernel_size, 1, dilation=1,
                               padding=get_padding(kernel_size, 1))),
            weight_norm(Conv1d(channels, channels, kernel_size, 1, dilation=1,
                               padding=get_padding(kernel_size, 1))),
            weight_norm(Conv1d(channels, channels, kernel_size, 1, dilation=1,
                               padding=get_padding(kernel_size, 1)))
        ])
        self.convs2.apply(init_weights)

    def forward(self, x):
        for c1, c2 in zip(self.convs1, self.convs2):
            xt = F.leaky_relu(x, LRELU_SLOPE)
            xt = c1(xt)
            xt = F.leaky_relu(xt, LRELU_SLOPE)
            xt = c2(xt)
            x = xt + x
        return x

    def remove_weight_norm(self):
        for l in self.convs1:
            remove_weight_norm(l)
        for l in self.convs2:
            remove_weight_norm(l)


class ResBlock2(torch.nn.Module):
    def __init__(self, h, channels, kernel_size=3, dilation=(1, 3)):
        super(ResBlock2, self).__init__()
        self.h = h
        self.convs = nn.ModuleList([
            weight_norm(Conv1d(channels, channels, kernel_size, 1, dilation=dilation[0],
                               padding=get_padding(kernel_size, dilation[0]))),
            weight_norm(Conv1d(channels, channels, kernel_size, 1, dilation=dilation[1],
                               padding=get_padding(kernel_size, dilation[1])))
        ])
        self.convs.apply(init_weights)

    def forward(self, x):
        for c in self.convs:
            xt = F.leaky_relu(x, LRELU_SLOPE)
            xt = c(xt)
            x = xt + x
        return x

    def remove_weight_norm(self):
        for l in self.convs:
            remove_weight_norm(l)


class Generator(torch.nn.Module):
    def __init__(self, h):
        super(Generator, self).__init__()
        self.h = h
        self.num_kernels = len(h.resblock_kernel_sizes)
        self.num_upsamples = len(h.upsample_rates)
        self.conv_pre = weight_norm(Conv1d(80, h.upsample_initial_channel, 7, 1, padding=3))
        resblock = ResBlock1 if h.resblock == '1' else ResBlock2

        self.ups = nn.ModuleList()
        for i, (u, k) in enumerate(zip(h.upsample_rates, h.upsample_kernel_sizes)):
            self.ups.append(weight_norm(
                ConvTranspose1d(h.upsample_initial_channel//(2**i), h.upsample_initial_channel//(2**(i+1)),
                                k, u, padding=(k-u)//2)))

        self.resblocks = nn.ModuleList()
        for i in range(len(self.ups)):
            ch = h.upsample_initial_channel//(2**(i+1))
            for j, (k, d) in enumerate(zip(h.resblock_kernel_sizes, h.resblock_dilation_sizes)):
                self.resblocks.append(resblock(h, ch, k, d))

        self.conv_post = weight_norm(Conv1d(ch, 1, 7, 1, padding=3))
        self.ups.apply(init_weights)
        self.conv_post.apply(init_weights)

    def forward(self, x):
        x = self.conv_pre(x)
        for i in range(self.num_upsamples):
            x = F.leaky_relu(x, LRELU_SLOPE)
            x = self.ups[i](x)
            xs = None
            for j in range(self.num_kernels):
                if xs is None:
                    xs = self.resblocks[i*self.num_kernels+j](x)
                else:
                    xs += self.resblocks[i*self.num_kernels+j](x)
            x = xs / self.num_kernels
        x = F.leaky_relu(x)
        x = self.conv_post(x)
        x = torch.tanh(x)

        return x

    def remove_weight_norm(self):
        print('Removing weight norm...')
        for l in self.ups:
            remove_weight_norm(l)
        for l in self.resblocks:
            l.remove_weight_norm()
        remove_weight_norm(self.conv_pre)
        remove_weight_norm(self.conv_post)


class DiscriminatorP(torch.nn.Module):
    def __init__(self, period, kernel_size=5, stride=3, use_spectral_norm=False):
        super(DiscriminatorP, self).__init__()
        self.period = period
        norm_f = weight_norm if use_spectral_norm == False else spectral_norm
        self.convs = nn.ModuleList([
            norm_f(Conv2d(1, 32, (kernel_size, 1), (stride, 1), padding=(get_padding(5, 1), 0))),
            norm_f(Conv2d(32, 128, (kernel_size, 1), (stride, 1), padding=(get_padding(5, 1), 0))),
            norm_f(Conv2d(128, 512, (kernel_size, 1), (stride, 1), padding=(get_padding(5, 1), 0))),
            norm_f(Conv2d(512, 1024, (kernel_size, 1), (stride, 1), padding=(get_padding(5, 1), 0))),
            norm_f(Conv2d(1024, 1024, (kernel_size, 1), 1, padding=(2, 0))),
        ])
        self.conv_post = norm_f(Conv2d(1024, 1, (3, 1), 1, padding=(1, 0)))

    def forward(self, x):
        fmap = []

        # 1d to 2d
        b, c, t = x.shape
        if t % self.period != 0:  # pad first
            n_pad = self.period - (t % self.period)
            x = F.pad(x, (0, n_pad), "reflect")
            t = t + n_pad
        x = x.view(b, c, t // self.period, self.period)

        for l in self.convs:
            x = l(x)
            x = F.leaky_relu(x, LRELU_SLOPE)
            fmap.append(x)
        x = self.conv_post(x)
        fmap.append(x)
        x = torch.flatten(x, 1, -1)

        return x, fmap


class MultiPeriodDiscriminator(torch.nn.Module):
    def __init__(self):
        super(MultiPeriodDiscriminator, self).__init__()
        self.discriminators = nn.ModuleList([
            DiscriminatorP(2),
            DiscriminatorP(3),
            DiscriminatorP(5),
            DiscriminatorP(7),
            DiscriminatorP(11),
        ])

    def forward(self, y, y_hat):
        y_d_rs = []
        y_d_gs = []
        fmap_rs = []
        fmap_gs = []
        for i, d in enumerate(self.discriminators):
            y_d_r, fmap_r = d(y)
            y_d_g, fmap_g = d(y_hat)
            y_d_rs.append(y_d_r)
            fmap_rs.append(fmap_r)
            y_d_gs.append(y_d_g)
            fmap_gs.append(fmap_g)

        return y_d_rs, y_d_gs, fmap_rs, fmap_gs


class DiscriminatorS(torch.nn.Module):
    def __init__(self, use_spectral_norm=False):
        super(DiscriminatorS, self).__init__()
        norm_f = weight_norm if use_spectral_norm == False else spectral_norm
        self.convs = nn.ModuleList([
            norm_f(Conv1d(1, 128, 15, 1, padding=7)),
            norm_f(Conv1d(128, 128, 41, 2, groups=4, padding=20)),
            norm_f(Conv1d(128, 256, 41, 2, groups=16, padding=20)),
            norm_f(Conv1d(256, 512, 41, 4, groups=16, padding=20)),
            norm_f(Conv1d(512, 1024, 41, 4, groups=16, padding=20)),
            norm_f(Conv1d(1024, 1024, 41, 1, groups=16, padding=20)),
            norm_f(Conv1d(1024, 1024, 5, 1, padding=2)),
        ])
        self.conv_post = norm_f(Conv1d(1024, 1, 3, 1, padding=1))

    def forward(self, x):
        fmap = []
        for l in self.convs:
            x = l(x)
            x = F.leaky_relu(x, LRELU_SLOPE)
            fmap.append(x)
        x = self.conv_post(x)
        fmap.append(x)
        x = torch.flatten(x, 1, -1)

        return x, fmap


class MultiScaleDiscriminator(torch.nn.Module):
    def __init__(self):
        super(MultiScaleDiscriminator, self).__init__()
        self.discriminators = nn.ModuleList([
            DiscriminatorS(use_spectral_norm=True),
            DiscriminatorS(),
            DiscriminatorS(),
        ])
        self.meanpools = nn.ModuleList([
            AvgPool1d(4, 2, padding=2),
            AvgPool1d(4, 2, padding=2)
        ])

    def forward(self, y, y_hat):
        y_d_rs = []
        y_d_gs = []
        fmap_rs = []
        fmap_gs = []
        for i, d in enumerate(self.discriminators):
            if i != 0:
                y = self.meanpools[i-1](y)
                y_hat = self.meanpools[i-1](y_hat)
            y_d_r, fmap_r = d(y)
            y_d_g, fmap_g = d(y_hat)
            y_d_rs.append(y_d_r)
            fmap_rs.append(fmap_r)
            y_d_gs.append(y_d_g)
            fmap_gs.append(fmap_g)

        return y_d_rs, y_d_gs, fmap_rs, fmap_gs


def feature_loss(fmap_r, fmap_g):
    loss = 0
    for dr, dg in zip(fmap_r, fmap_g):
        for rl, gl in zip(dr, dg):
            loss += torch.mean(torch.abs(rl - gl))

    return loss*2


def discriminator_loss(disc_real_outputs, disc_generated_outputs):
    loss = 0
    r_losses = []
    g_losses = []
    for dr, dg in zip(disc_real_outputs, disc_generated_outputs):
        r_loss = torch.mean((1-dr)**2)
        g_loss = torch.mean(dg**2)
        loss += (r_loss + g_loss)
        r_losses.append(r_loss.item())
        g_losses.append(g_loss.item())

    return loss, r_losses, g_losses


def generator_loss(disc_outputs):
    loss = 0
    gen_losses = []
    for dg in disc_outputs:
        l = torch.mean((1-dg)**2)
        gen_losses.append(l)
        loss += l

    return loss, gen_losses
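A minimal inference sketch for the classes above, assuming the tacotron_gst.hifigan import path used elsewhere in this commit and a hypothetical generator checkpoint written by train.py; the config filename, checkpoint name, and dummy mel shape are illustrative, not part of this commit:

import json
import torch
from tacotron_gst.hifigan.env import AttrDict        # assumed path; train.py imports env directly
from tacotron_gst.hifigan.models import Generator

# Load the hyperparameters the Generator expects (upsample rates, resblock config, ...).
with open('config_v1.json') as f:
    h = AttrDict(json.load(f))

generator = Generator(h)
state_dict_g = torch.load('cp_hifigan/g_00100000', map_location='cpu')  # hypothetical checkpoint
generator.load_state_dict(state_dict_g['generator'])
generator.eval()
generator.remove_weight_norm()  # fold weight norm into the conv weights for inference

with torch.no_grad():
    mel = torch.randn(1, 80, 200)        # (batch, num_mels, frames) dummy input
    audio = generator(mel).squeeze()     # waveform in [-1, 1], roughly frames * hop_size samples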
hifigan/train.py
ADDED
@@ -0,0 +1,271 @@
import warnings
warnings.simplefilter(action='ignore', category=FutureWarning)
import itertools
import os
import time
import argparse
import json
import torch
import torch.nn.functional as F
from torch.utils.tensorboard import SummaryWriter
from torch.utils.data import DistributedSampler, DataLoader
import torch.multiprocessing as mp
from torch.distributed import init_process_group
from torch.nn.parallel import DistributedDataParallel
from env import AttrDict, build_env
from tacotron_gst.hifigan.meldataset import MelDataset, mel_spectrogram, get_dataset_filelist
from tacotron_gst.hifigan.models import Generator, MultiPeriodDiscriminator, MultiScaleDiscriminator, feature_loss, generator_loss,\
    discriminator_loss
from tacotron_gst.hifigan.utils import plot_spectrogram, scan_checkpoint, load_checkpoint, save_checkpoint

torch.backends.cudnn.benchmark = True


def train(rank, a, h):
    if h.num_gpus > 1:
        init_process_group(backend=h.dist_config['dist_backend'], init_method=h.dist_config['dist_url'],
                           world_size=h.dist_config['world_size'] * h.num_gpus, rank=rank)

    torch.cuda.manual_seed(h.seed)
    device = torch.device('cuda:{:d}'.format(rank))

    generator = Generator(h).to(device)
    mpd = MultiPeriodDiscriminator().to(device)
    msd = MultiScaleDiscriminator().to(device)

    if rank == 0:
        print(generator)
        os.makedirs(a.checkpoint_path, exist_ok=True)
        print("checkpoints directory : ", a.checkpoint_path)

    if os.path.isdir(a.checkpoint_path):
        cp_g = scan_checkpoint(a.checkpoint_path, 'g_')
        cp_do = scan_checkpoint(a.checkpoint_path, 'do_')

    steps = 0
    if cp_g is None or cp_do is None:
        state_dict_do = None
        last_epoch = -1
    else:
        state_dict_g = load_checkpoint(cp_g, device)
        state_dict_do = load_checkpoint(cp_do, device)
        generator.load_state_dict(state_dict_g['generator'])
        mpd.load_state_dict(state_dict_do['mpd'])
        msd.load_state_dict(state_dict_do['msd'])
        steps = state_dict_do['steps'] + 1
        last_epoch = state_dict_do['epoch']

    if h.num_gpus > 1:
        generator = DistributedDataParallel(generator, device_ids=[rank]).to(device)
        mpd = DistributedDataParallel(mpd, device_ids=[rank]).to(device)
        msd = DistributedDataParallel(msd, device_ids=[rank]).to(device)

    optim_g = torch.optim.AdamW(generator.parameters(), h.learning_rate, betas=[h.adam_b1, h.adam_b2])
    optim_d = torch.optim.AdamW(itertools.chain(msd.parameters(), mpd.parameters()),
                                h.learning_rate, betas=[h.adam_b1, h.adam_b2])

    if state_dict_do is not None:
        optim_g.load_state_dict(state_dict_do['optim_g'])
        optim_d.load_state_dict(state_dict_do['optim_d'])

    scheduler_g = torch.optim.lr_scheduler.ExponentialLR(optim_g, gamma=h.lr_decay, last_epoch=last_epoch)
    scheduler_d = torch.optim.lr_scheduler.ExponentialLR(optim_d, gamma=h.lr_decay, last_epoch=last_epoch)

    training_filelist, validation_filelist = get_dataset_filelist(a)

    trainset = MelDataset(training_filelist, h.segment_size, h.n_fft, h.num_mels,
                          h.hop_size, h.win_size, h.sampling_rate, h.fmin, h.fmax, n_cache_reuse=0,
                          shuffle=False if h.num_gpus > 1 else True, fmax_loss=h.fmax_for_loss, device=device,
                          fine_tuning=a.fine_tuning, base_mels_path=a.input_mels_dir)

    train_sampler = DistributedSampler(trainset) if h.num_gpus > 1 else None

    train_loader = DataLoader(trainset, num_workers=h.num_workers, shuffle=False,
                              sampler=train_sampler,
                              batch_size=h.batch_size,
                              pin_memory=True,
                              drop_last=True)

    if rank == 0:
        validset = MelDataset(validation_filelist, h.segment_size, h.n_fft, h.num_mels,
                              h.hop_size, h.win_size, h.sampling_rate, h.fmin, h.fmax, False, False, n_cache_reuse=0,
                              fmax_loss=h.fmax_for_loss, device=device, fine_tuning=a.fine_tuning,
                              base_mels_path=a.input_mels_dir)
        validation_loader = DataLoader(validset, num_workers=1, shuffle=False,
                                       sampler=None,
                                       batch_size=1,
                                       pin_memory=True,
                                       drop_last=True)

        sw = SummaryWriter(os.path.join(a.checkpoint_path, 'logs'))

    generator.train()
    mpd.train()
    msd.train()
    for epoch in range(max(0, last_epoch), a.training_epochs):
        if rank == 0:
            start = time.time()
            print("Epoch: {}".format(epoch+1))

        if h.num_gpus > 1:
            train_sampler.set_epoch(epoch)

        for i, batch in enumerate(train_loader):
            if rank == 0:
                start_b = time.time()
            x, y, _, y_mel = batch
            x = torch.autograd.Variable(x.to(device, non_blocking=True))
            y = torch.autograd.Variable(y.to(device, non_blocking=True))
            y_mel = torch.autograd.Variable(y_mel.to(device, non_blocking=True))
            y = y.unsqueeze(1)

            y_g_hat = generator(x)
            y_g_hat_mel = mel_spectrogram(y_g_hat.squeeze(1), h.n_fft, h.num_mels, h.sampling_rate, h.hop_size, h.win_size,
                                          h.fmin, h.fmax_for_loss)

            optim_d.zero_grad()

            # MPD
            y_df_hat_r, y_df_hat_g, _, _ = mpd(y, y_g_hat.detach())
            loss_disc_f, losses_disc_f_r, losses_disc_f_g = discriminator_loss(y_df_hat_r, y_df_hat_g)

            # MSD
            y_ds_hat_r, y_ds_hat_g, _, _ = msd(y, y_g_hat.detach())
            loss_disc_s, losses_disc_s_r, losses_disc_s_g = discriminator_loss(y_ds_hat_r, y_ds_hat_g)

            loss_disc_all = loss_disc_s + loss_disc_f

            loss_disc_all.backward()
            optim_d.step()

            # Generator
            optim_g.zero_grad()

            # L1 Mel-Spectrogram Loss
            loss_mel = F.l1_loss(y_mel, y_g_hat_mel) * 45

            y_df_hat_r, y_df_hat_g, fmap_f_r, fmap_f_g = mpd(y, y_g_hat)
            y_ds_hat_r, y_ds_hat_g, fmap_s_r, fmap_s_g = msd(y, y_g_hat)
            loss_fm_f = feature_loss(fmap_f_r, fmap_f_g)
            loss_fm_s = feature_loss(fmap_s_r, fmap_s_g)
            loss_gen_f, losses_gen_f = generator_loss(y_df_hat_g)
            loss_gen_s, losses_gen_s = generator_loss(y_ds_hat_g)
            loss_gen_all = loss_gen_s + loss_gen_f + loss_fm_s + loss_fm_f + loss_mel

            loss_gen_all.backward()
            optim_g.step()

            if rank == 0:
                # STDOUT logging
                if steps % a.stdout_interval == 0:
                    with torch.no_grad():
                        mel_error = F.l1_loss(y_mel, y_g_hat_mel).item()

                    print('Steps : {:d}, Gen Loss Total : {:4.3f}, Mel-Spec. Error : {:4.3f}, s/b : {:4.3f}'.
                          format(steps, loss_gen_all, mel_error, time.time() - start_b))

                # checkpointing
                if steps % a.checkpoint_interval == 0 and steps != 0:
                    checkpoint_path = "{}/g_{:08d}".format(a.checkpoint_path, steps)
                    save_checkpoint(checkpoint_path,
                                    {'generator': (generator.module if h.num_gpus > 1 else generator).state_dict()})
                    checkpoint_path = "{}/do_{:08d}".format(a.checkpoint_path, steps)
                    save_checkpoint(checkpoint_path,
                                    {'mpd': (mpd.module if h.num_gpus > 1
                                             else mpd).state_dict(),
                                     'msd': (msd.module if h.num_gpus > 1
                                             else msd).state_dict(),
                                     'optim_g': optim_g.state_dict(), 'optim_d': optim_d.state_dict(), 'steps': steps,
                                     'epoch': epoch})

                # Tensorboard summary logging
                if steps % a.summary_interval == 0:
                    sw.add_scalar("training/gen_loss_total", loss_gen_all, steps)
                    sw.add_scalar("training/mel_spec_error", mel_error, steps)

                # Validation
                if steps % a.validation_interval == 0:  # and steps != 0:
                    generator.eval()
                    torch.cuda.empty_cache()
                    val_err_tot = 0
                    with torch.no_grad():
                        for j, batch in enumerate(validation_loader):
                            x, y, _, y_mel = batch
                            y_g_hat = generator(x.to(device))
                            y_mel = torch.autograd.Variable(y_mel.to(device, non_blocking=True))
                            y_g_hat_mel = mel_spectrogram(y_g_hat.squeeze(1), h.n_fft, h.num_mels, h.sampling_rate,
                                                          h.hop_size, h.win_size,
                                                          h.fmin, h.fmax_for_loss)
                            val_err_tot += F.l1_loss(y_mel, y_g_hat_mel).item()

                            if j <= 4:
                                if steps == 0:
                                    sw.add_audio('gt/y_{}'.format(j), y[0], steps, h.sampling_rate)
                                    sw.add_figure('gt/y_spec_{}'.format(j), plot_spectrogram(x[0]), steps)

                                sw.add_audio('generated/y_hat_{}'.format(j), y_g_hat[0], steps, h.sampling_rate)
                                y_hat_spec = mel_spectrogram(y_g_hat.squeeze(1), h.n_fft, h.num_mels,
                                                             h.sampling_rate, h.hop_size, h.win_size,
                                                             h.fmin, h.fmax)
                                sw.add_figure('generated/y_hat_spec_{}'.format(j),
                                              plot_spectrogram(y_hat_spec.squeeze(0).cpu().numpy()), steps)

                        val_err = val_err_tot / (j+1)
                        sw.add_scalar("validation/mel_spec_error", val_err, steps)

                    generator.train()

            steps += 1

        scheduler_g.step()
        scheduler_d.step()

        if rank == 0:
            print('Time taken for epoch {} is {} sec\n'.format(epoch + 1, int(time.time() - start)))


def main():
    print('Initializing Training Process..')

    parser = argparse.ArgumentParser()

    parser.add_argument('--group_name', default=None)
    parser.add_argument('--input_wavs_dir', default='LJSpeech-1.1/wavs')
    parser.add_argument('--input_mels_dir', default='ft_dataset')
    parser.add_argument('--input_training_file', default='LJSpeech-1.1/training.txt')
    parser.add_argument('--input_validation_file', default='LJSpeech-1.1/validation.txt')
    parser.add_argument('--checkpoint_path', default='cp_hifigan')
    parser.add_argument('--config', default='')
    parser.add_argument('--training_epochs', default=3100, type=int)
    parser.add_argument('--stdout_interval', default=5, type=int)
    parser.add_argument('--checkpoint_interval', default=5000, type=int)
    parser.add_argument('--summary_interval', default=100, type=int)
    parser.add_argument('--validation_interval', default=1000, type=int)
    parser.add_argument('--fine_tuning', default=False, type=bool)

    a = parser.parse_args()

    with open(a.config) as f:
        data = f.read()

    json_config = json.loads(data)
    h = AttrDict(json_config)
    build_env(a.config, 'config.json', a.checkpoint_path)

    torch.manual_seed(h.seed)
    if torch.cuda.is_available():
        torch.cuda.manual_seed(h.seed)
        h.num_gpus = torch.cuda.device_count()
        h.batch_size = int(h.batch_size / h.num_gpus)
        print('Batch size per GPU :', h.batch_size)
    else:
        pass

    if h.num_gpus > 1:
        mp.spawn(train, nprocs=h.num_gpus, args=(a, h,))
    else:
        train(0, a, h)


if __name__ == '__main__':
    main()
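Training is launched with something like python train.py --config config_v1.json, the remaining arguments keeping their defaults above. A hedged sketch of the two checkpoint files the loop writes every checkpoint_interval steps; the directory matches the default --checkpoint_path, while the step number is illustrative:

import torch

# g_XXXXXXXX holds only the generator weights; do_XXXXXXXX holds everything needed to resume.
state_dict_g = torch.load('cp_hifigan/g_00005000', map_location='cpu')
print(list(state_dict_g.keys()))    # ['generator']

state_dict_do = torch.load('cp_hifigan/do_00005000', map_location='cpu')
print(list(state_dict_do.keys()))   # ['mpd', 'msd', 'optim_g', 'optim_d', 'steps', 'epoch']
# On restart, scan_checkpoint() finds the newest pair and training resumes from steps + 1.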
hifigan/utils.py
ADDED
@@ -0,0 +1,58 @@
import glob
import os
import matplotlib
import torch
from torch.nn.utils import weight_norm
matplotlib.use("Agg")
import matplotlib.pylab as plt


def plot_spectrogram(spectrogram):
    fig, ax = plt.subplots(figsize=(10, 2))
    im = ax.imshow(spectrogram, aspect="auto", origin="lower",
                   interpolation='none')
    plt.colorbar(im, ax=ax)

    fig.canvas.draw()
    plt.close()

    return fig


def init_weights(m, mean=0.0, std=0.01):
    classname = m.__class__.__name__
    if classname.find("Conv") != -1:
        m.weight.data.normal_(mean, std)


def apply_weight_norm(m):
    classname = m.__class__.__name__
    if classname.find("Conv") != -1:
        weight_norm(m)


def get_padding(kernel_size, dilation=1):
    return int((kernel_size*dilation - dilation)/2)


def load_checkpoint(filepath, device):
    assert os.path.isfile(filepath)
    print("Loading '{}'".format(filepath))
    checkpoint_dict = torch.load(filepath, map_location=device)
    print("Complete.")
    return checkpoint_dict


def save_checkpoint(filepath, obj):
    print("Saving checkpoint to {}".format(filepath))
    torch.save(obj, filepath)
    print("Complete.")


def scan_checkpoint(cp_dir, prefix):
    pattern = os.path.join(cp_dir, prefix + '????????')
    cp_list = glob.glob(pattern)
    if len(cp_list) == 0:
        return None
    return sorted(cp_list)[-1]
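A small usage sketch for the checkpoint helpers, assuming the same directory and prefixes as train.py; the tacotron_gst.hifigan import path mirrors the other modules in this commit and is an assumption:

from tacotron_gst.hifigan.utils import scan_checkpoint, load_checkpoint

# scan_checkpoint matches files named <prefix> plus an 8-digit step count, e.g. cp_hifigan/g_00005000.
cp_g = scan_checkpoint('cp_hifigan', 'g_')
if cp_g is not None:
    state_dict_g = load_checkpoint(cp_g, 'cpu')  # prints progress, returns the saved dict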