Archives: Tweets
-
@mmitchell_ai @BayesForDays @haldaume3 as soon as you have >1 timescale on which you’re simultaneously training, need to cope.
-
Uhura smacking Chekov because she’s mad at Kirk. Hon, you can’t displace your anger at authority like that. https://t.co/Lsrlt0IeXk
-
@mmitchell_ai @BayesForDays @haldaume3 better(?) yet, hidden phonetic layer input?
-
@mmitchell_ai @BayesForDays @haldaume3 *not phonetics. Language modeling is hard, swipe keyboard.
-
@mmitchell_ai @BayesForDays @haldaume3 smoothed ngram model captures rarity well, but not phonetics
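(A minimal sketch of the kind of smoothed n-gram model the tweet above gestures at — add-one smoothed bigrams, where unseen word pairs still get nonzero probability. The corpus and class name here are invented for illustration, not anything from the thread.)

```python
from collections import Counter

class SmoothedBigramLM:
    """Add-one (Laplace) smoothed bigram language model — illustrative only."""

    def __init__(self, sentences):
        self.prev_counts = Counter()   # counts of each word in "previous" position
        self.bigrams = Counter()       # counts of (prev, word) pairs
        self.vocab = set()
        for sent in sentences:
            tokens = ["<s>"] + sent.split() + ["</s>"]
            self.vocab.update(tokens)
            for a, b in zip(tokens, tokens[1:]):
                self.prev_counts[a] += 1
                self.bigrams[(a, b)] += 1

    def prob(self, prev, word):
        # Add-one smoothing: every bigram, seen or not, gets at least
        # pseudo-count 1, so rare and unseen continuations keep nonzero mass.
        return (self.bigrams[(prev, word)] + 1) / (self.prev_counts[prev] + len(self.vocab))

lm = SmoothedBigramLM(["the cat sat", "the cat ran", "a dog ran"])
print(lm.prob("the", "cat"))  # seen bigram: higher probability
print(lm.prob("the", "dog"))  # unseen bigram: small but nonzero
```

Note the limitation the tweet points out: counts alone say nothing about phonetic similarity between words — "cat" and "kat" are unrelated symbols to this model.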
-
@mmitchell_ai @BayesForDays @haldaume3 YES to this direction
-
@mmitchell_ai @BayesForDays @haldaume3 I have some ideas about how to make “explainable” lower layers in an RNN. Love to chat some time.
-
@mmitchell_ai @BayesForDays @haldaume3 right, *domain* specific, but no hidden task-specific “explainable” model
-
@mmitchell_ai @BayesForDays @haldaume3 all your examples sound like British style crossword clues
-
@mmitchell_ai @haldaume3 @BayesForDays are you performing adversarial learning live on Twitter?