Machine Learning Street Talk (YouTube and Audio Podcast). Run by Dr. Tim Scarfe @ecsquendor, Dr. Yannic Kilcher @ykilcher and Dr. Keith Duggar.

London
Joined April 2020
We spoke with @fchollet about neural program synthesis, the manifold hypothesis, type 1 and type 2 generalisation, the measure of intelligence, the intelligence explosion and Keras! invidio.us/J0p_thJJnoo with @ykilcher @ecsquendor
Machine Learning Street Talk retweeted
A common beginner mistake is to misunderstand the meaning of the term "interpolation" in machine learning. Let's take a look ⬇️⬇️⬇️
Machine Learning Street Talk retweeted
If you've been following work in NLP commonsense reasoning, this is worth a read:
Advancements in Commonsense Reasoning and Language Models make the Winograd Schema look solved. After revisiting the experimental setup and evaluation, we show that models perform randomly! Our latest work at EMNLP, w/ @hongming110, @yoavgo, @DanRothNLP arxiv.org/abs/2104.08161
Machine Learning Street Talk retweeted
Any problem can be solved via associative intuition (pattern recognition) if you've already seen hundreds of variants of the problem before. The value of principled reasoning lies in the ability to handle problems you've never seen before.
Machine Learning Street Talk retweeted
SHOCKING: If you create a dataset of questions, then delete all questions GPT-3 gets right, you'll end up with a dataset of questions that GPT-3 gets wrong😱
🔥This is strange🔥 The MORE parameters GPT-3 gets, the more LIES it outputs🤯Large Language Models appear to get worse as they get bigger! But is the outrage justified? What are people not telling you about this story? Watch the video to find out👀 invidio.us/aX8phGhG8VQ
Machine Learning Street Talk retweeted
Special thanks to @fhuszar for this fantastic meme 🔥 "The Avengers of Geometric Deep Learning" @mmbronstein @ecsquendor @ykilcher @MLStreetTalk
Machine Learning Street Talk retweeted
🥳New Video🥳Topographic VAEs explicitly structure the latent space into capsules of correlated, topographically organized variables. This allows the model to learn equivariance relations between latent and data spaces. Check out the video💪 invidio.us/pBau7umFhjQ @wellingmax
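For readers who want a concrete handle on the "capsules of correlated variables" idea, here is a very loose sketch (not the model or objective from the paper): a plain autoencoder whose latent is reshaped into capsules, with a cyclic roll within each capsule standing in for the transformation traversal a trained Topographic VAE exhibits. The capsule layout, layer sizes, and 28x28 inputs are all assumptions.

```python
import torch
import torch.nn as nn

# Loose sketch only: capsule-structured latent with a cyclic roll,
# NOT the actual Topographic VAE objective from the paper.
class CapsuleAutoencoder(nn.Module):
    def __init__(self, n_caps=16, cap_dim=8, in_dim=28 * 28):
        super().__init__()
        self.n_caps, self.cap_dim = n_caps, cap_dim
        z_dim = n_caps * cap_dim
        self.enc = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(), nn.Linear(256, in_dim))

    def forward(self, x, shift=0):
        z = self.enc(x).view(-1, self.n_caps, self.cap_dim)  # capsule-structured latent
        z = torch.roll(z, shifts=shift, dims=-1)             # move within each capsule
        return self.dec(z.flatten(1))

model = CapsuleAutoencoder()
x = torch.rand(4, 28 * 28)
# One decode per roll step: in a trained TVAE these would trace out a transformation.
traversal = [model(x, shift=s) for s in range(model.cap_dim)]
```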
Machine Learning Street Talk retweeted
Amazing video! Fantastic book, too, I am glad it receives all this attention. Many of those ideas & concepts are very fundamental and so helpful to understand, regardless of which specific sub-field of machine learning one is in.
Epic special edition MLST on geometric deep learning! Been in the works since May! with @mmbronstein @PetarV_93 @TacoCohen @joanbruna @ykilcher @ecsquendor invidio.us/bIZB1hIJ4u8
Epic special edition MLST on geometric deep learning! Been in the works since May! with @mmbronstein @PetarV_93 @TacoCohen @joanbruna @ykilcher @ecsquendor invidio.us/bIZB1hIJ4u8
Check out their geometric deep learning proto-book here: arxiv.org/abs/2104.13478 "Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges" Michael M. Bronstein, Joan Bruna, Taco Cohen, Petar Veličković -- it must have been a herculean effort for them to produce it
Lightspeed Kilcher has just broken the 100K boundary and is now accelerating to 200K! Amazing job!
Thank you all for 100k subscribers! Here's to the next epoch 🥳 invidio.us/ifBI2jTaAEo
Machine Learning Street Talk retweeted
We've founded a non-profit, LAION, which aims to democratize large-scale ML models and datasets, and released LAION-400M, a dataset of 400M CLIP-filtered image-text pairs! Project: laion.ai/laion-400-open-data… About LAION: laion.ai/ A paper will be released soon.
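As a rough illustration of what "CLIP-filtered" means (a sketch, not the LAION pipeline itself): embed each image and its caption with CLIP and keep the pair only if the cosine similarity of the two embeddings clears a threshold. The Hugging Face checkpoint and the 0.3 cutoff below are assumptions.

```python
import torch
from transformers import CLIPModel, CLIPProcessor

# Sketch of CLIP filtering: keep (image, caption) pairs whose embeddings agree.
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32").eval()
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def clip_filter(pairs, threshold=0.3):
    """pairs: iterable of (PIL.Image, str). Returns the pairs that pass the filter."""
    kept = []
    for image, caption in pairs:
        inputs = processor(text=[caption], images=image, return_tensors="pt", padding=True)
        with torch.no_grad():
            out = model(**inputs)
        img = out.image_embeds / out.image_embeds.norm(dim=-1, keepdim=True)
        txt = out.text_embeds / out.text_embeds.norm(dim=-1, keepdim=True)
        if (img @ txt.T).item() >= threshold:   # cosine similarity of the pair
            kept.append((image, caption))
    return kept
```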
Machine Learning Street Talk retweeted
A year ago I asked: Is there more than Self-Attention and Deep Sets? - and got very insightful answers. 🙏 Now, Ed, Martin and I wrote up our own take on the various neural network architectures for sets. Have a look and tell us what you think! :) ➡️fabianfuchsml.github.io/lear… ☕️
Both Max-Pooling (e.g. DeepSets) and Self-Attention are permutation invariant/equivariant neural network architectures for set-based problems. I am aware of a couple of variations for both of these. Are there additional, fundamentally different architectures for sets? 🤔
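For anyone new to the distinction: a Deep Sets style model applies a shared per-element network (permutation-equivariant) and then a symmetric pooling (permutation-invariant), while self-attention mixes set elements equivariantly before any pooling. A minimal max-pooling sketch, with all layer sizes chosen arbitrarily:

```python
import torch
import torch.nn as nn

class DeepSets(nn.Module):
    """Permutation-invariant set model: shared phi per element, symmetric pool, then rho."""
    def __init__(self, in_dim=3, hidden=16, out_dim=4):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden))
        self.rho = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, out_dim))

    def forward(self, x):             # x: (batch, set_size, in_dim)
        h = self.phi(x)               # per-element features (permutation-equivariant)
        pooled, _ = h.max(dim=1)      # max-pool over the set (permutation-invariant)
        return self.rho(pooled)

# Sanity check: shuffling the set does not change the output.
model = DeepSets()
x = torch.randn(2, 5, 3)
perm = torch.randperm(5)
assert torch.allclose(model(x), model(x[:, perm]), atol=1e-5)
```

Swapping the max for a sum or mean, or replacing phi with a self-attention block, preserves the invariance as long as the final pooling over the set is symmetric.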
Machine Learning Street Talk retweeted
There is a lot more to natural language than text: tone, accent, expression, prosody, timbre, pitch... "Textless NLP" represents speech through a stream of discrete tokens, automatically learned through self-supervised learning and fed directly with raw speech waveforms! A new era.
We’re introducing GSLM, the first language model that breaks free completely of the dependence on text for training. This “textless NLP” approach learns to generate expressive speech using only raw audio recordings as input. Learn more and get the code: ai.facebook.com/blog/textles…
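A common recipe for such discrete units, in the same spirit as GSLM though the exact details differ, is to cluster frame-level features from a self-supervised speech encoder into a small codebook. The sketch below uses torchaudio's HuBERT; the layer index and the codebook size of 100 are assumptions, not the GSLM configuration.

```python
import torch
import torchaudio
from sklearn.cluster import KMeans

# Self-supervised speech encoder (HuBERT base from torchaudio).
bundle = torchaudio.pipelines.HUBERT_BASE
encoder = bundle.get_model().eval()

def speech_to_units(wav_paths, n_units=100, layer=6):
    """Turn raw waveforms into sequences of discrete pseudo-text units."""
    per_file = []
    for path in wav_paths:
        wav, sr = torchaudio.load(path)
        wav = wav.mean(0, keepdim=True)                              # collapse to mono
        wav = torchaudio.functional.resample(wav, sr, bundle.sample_rate)
        with torch.no_grad():
            feats, _ = encoder.extract_features(wav)                 # per-layer frame features
        per_file.append(feats[layer].squeeze(0))                     # (frames, feature_dim)

    # Cluster all frames; each cluster id becomes one discrete "token".
    km = KMeans(n_clusters=n_units, n_init=10).fit(torch.cat(per_file).numpy())
    return [km.predict(f.numpy()) for f in per_file]
```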
Machine Learning Street Talk retweeted
Super proud to present our new work (arxiv.org/abs/2109.04173) led by Matej on connecting GNNs with causality. We present rigorous theoretical results derived from first principles and establish a novel connection between GNNs and SCMs. Joint work with @PetarV_93 @kerstingAIML
Machine Learning Street Talk retweeted
We loved listening to and watching this one. Incredibly captivating intro and insightful conversation. Thanks for having Jeff on! @MLStreetTalk
We had a fascinating conversation with Jeff Hawkins from @Numenta discussing his thousand brains theory of intelligence. invidio.us/6VQILbDqaI4 with @DoctorDuggar @NPCollapse