About model details

#1
by chujiezheng - opened

Hi, thanks for sharing this work! I have a few questions about this repo:

  • How much Reddit data was this BART model post-trained on? (e.g., number of sessions, number of responses, average response length, etc.)
  • What is the input format of this BART model? Suppose I have a multi-turn dialogue context (u1, u2, u3) and want to generate the next response u4. How should I organize (u1, u2, u3) before feeding it into the encoder? (My current guess is sketched below.)
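
For concreteness, here is what I would try by default, as a minimal sketch: I'm assuming the turns are simply joined with the tokenizer's separator/EOS token before being passed to the encoder, and I'm using `facebook/bart-base` only as a placeholder model id. Please correct me if the post-trained checkpoint expects a different format (special speaker tokens, different separators, etc.).

```python
from transformers import BartTokenizer, BartForConditionalGeneration

# Placeholder checkpoint for illustration; I'd swap in this repo's model id.
model_name = "facebook/bart-base"
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Multi-turn context (u1, u2, u3); my guess is that turns are joined with the
# separator/EOS token before being fed to the encoder -- is that correct?
context = [
    "Hello, how are you?",            # u1
    "I'm good, thanks. You?",         # u2
    "Doing well. Any weekend plans?", # u3
]
source = f" {tokenizer.eos_token} ".join(context)

inputs = tokenizer(source, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))  # candidate u4
```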
