laarc.io. Looking for AI work. DMs open. ML discord: discordapp.com/invite/x52Xz3… projects: status.shawwn.com

St Louis, MO
Joined January 2009
Last night, someone asked me what I've been up to since 2010. My reply turned into a short autobiography. I considered deleting it, but people encouraged me to post it instead: gist.github.com/shawwn/3110a… If you're unhappy with your life, it's important to believe you can fix it.
DALL·E mini apparently knows a lot about artwork!
*laughs in JAX*
Explaining Julia to R and Python programmers caused me to create
Shawn Presser retweeted
When you try to get a car with hair but mismatch SG2 layers and get cat-formers
My blind friend explained to me how playing a video game helped him learn to walk around in the real world. I thought it was really cool. We take for granted that if you press up, you walk upwards in 2D isometric games. But here, it was different, which had an advantage:
Replying to @theshawwn
Unlike most games for the blind, pressing the up arrow didn’t always move you north; it moved you forward. Which direction "forward" meant depended on which way you were facing. So you could traverse a hallway with the up arrow, and then go back with the same key, just by turning 180 degrees.
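The facing-relative movement scheme described above can be sketched in a few lines. This is a hypothetical illustration, not the actual game's code: "up" always means "forward", and forward depends on the current heading.

```python
# Headings and their (dx, dy) step on a grid. North decreases y.
HEADINGS = ["north", "east", "south", "west"]
STEP = {"north": (0, -1), "east": (1, 0), "south": (0, 1), "west": (-1, 0)}

class Player:
    def __init__(self):
        self.x, self.y = 0, 0
        self.facing = 0  # index into HEADINGS

    def turn(self, quarter_turns):
        # turn(2) is the 180-degree turn from the tweet
        self.facing = (self.facing + quarter_turns) % 4

    def press_up(self):
        # "up" moves forward relative to facing, not north
        dx, dy = STEP[HEADINGS[self.facing]]
        self.x += dx
        self.y += dy

p = Player()
p.press_up()       # facing north: moves one step north
p.turn(2)          # turn around
p.press_up()       # the same key now moves south, back to the start
print((p.x, p.y))  # (0, 0)
```

The advantage is exactly the one described: a single key traverses a hallway in both directions, because the mapping from key to world direction goes through the player's heading.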
I wrote my own terminal-based Beat Saber client. It uses python print() to do all the drawing. beatsaber-cli code: github.com/shawwn/scrap/blob… I made this partly to study Beat Saber patterns, and partly to see how far you can push terminal-based gaming. Turns out, pretty far.
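The core trick of print()-based terminal drawing can be sketched briefly. This is an illustration of the technique, not the actual beatsaber-cli code from the linked repo: build each frame as a string, clear the screen with an ANSI escape, and emit the frame with plain print().

```python
import time

WIDTH, HEIGHT = 20, 6

def render(frame):
    # One "note" falls one row per frame down a fixed column.
    rows = [[" "] * WIDTH for _ in range(HEIGHT)]
    rows[frame % HEIGHT][WIDTH // 2] = "#"
    return "\n".join("".join(r) for r in rows)

def play(frames=12, fps=8):
    for frame in range(frames):
        # ANSI escape: cursor home + clear screen, then redraw via print()
        print("\033[H\033[2J", end="")
        print(render(frame))
        time.sleep(1 / fps)
```

Calling `play()` in a real terminal animates the falling note; everything on screen is just text pushed through print(), which is the whole idea.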
By request: Spin Eternally invidio.us/watch?v=ZH2fjP-M…
Test tweet, please ignore.
Shawn Presser retweeted
✅We created an Open Source version of Dall-E using VQGAN+BART+CLIP. ✅We also fine-tuned CLIP for satellite data (another team I was in) ✅I released the first GPT-2 (117M) for Bengali trained from scratch. All using JAX/Flax on TPUv3-8. Pretty wild experience. Learned a lot.
This hit me right in the nostalgia.
web directory, usenet, a count of web pages
Shawn Presser retweeted
pretty crazy, close to 800 people joined 🤯 you can read more about it here: github.com/huggingface/trans… there were some nice (recorded) talks from JAX/Google/DeepMind people as well
This data was so striking that I made a graph. JAX accounts for 30% of all models uploaded to @huggingface hub.
PyTorch: 63.2%
JAX: 29.6%
Tensorflow: 7.3%
Replying to @Sopcaja @tunguz
This is not necessarily representative data, but it is interesting to note that among the models in @huggingface hub, 1,186 were written in TF, while JAX was used in 4,830 models. PyTorch had 10,322 models, btw. huggingface.co/models?filter…
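The graph's percentages follow directly from the raw counts in this tweet, assuming each framework's count is taken as a share of the three-framework total (an assumption of this sketch; models tagged with multiple frameworks would complicate it):

```python
# Recompute the graph's shares from the raw counts in the tweet.
counts = {"PyTorch": 10_322, "JAX": 4_830, "TensorFlow": 1_186}
total = sum(counts.values())  # 16,338

for name, n in counts.items():
    print(f"{name}: {100 * n / total:.1f}%")
# PyTorch: 63.2%, JAX: 29.6%, TensorFlow: 7.3% -- matching the graph
```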
To get started with JAX on TPU VMs, here's a... love story? I'm not sure what to call it. A guide, I suppose: blog.gpt4.org/jaxtpu
Unfortunately, my graph is wildly mistaken. nitter.net/AllesistKode/sta… "*JAX-only* models are numbered at 131, currently" -- i.e. a tenth of Tensorflow. Over 3k JAX models were autogenerated from BERT checkpoints. Still, nobody's converting models to TF, so Jax has momentum.
Many models were converted to JAX from PyTorch. *JAX-only* models are numbered at 131, currently. [ n(PyTorch ∪ Jax) − n(PyTorch) ]†
† This method ignores models that were written in Jax but were converted to PyTorch.
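The quoted counting method reads most naturally as set arithmetic: everything in the PyTorch-or-JAX union that isn't also available as PyTorch, i.e. n(PyTorch ∪ Jax) − n(PyTorch). A toy sketch with hypothetical model IDs (the real hub has thousands):

```python
# Hypothetical model IDs, standing in for huggingface hub entries.
pytorch = {"bert-base", "gpt2", "roberta"}
jax = {"bert-base", "gpt2", "flax-only-model"}  # mostly converted from PyTorch

# JAX-only models: in the union but not counted under PyTorch.
jax_only = len(pytorch | jax) - len(pytorch)
print(jax_only)  # 1 -- only "flax-only-model" was never in PyTorch

# As the footnote notes, this undercounts: a model written in Jax and
# later converted *to* PyTorch lands in the PyTorch set and is subtracted.
```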
I never know what to do in situations like this. I could delete it, but on the other hand, it *is* true that there are 4k JAX models on huggingface hub. Even if they were mostly autogenerated, they're still models that work. It'd be pointless to rewrite and retrain them.
Whoa. I missed this when it made the rounds a week ago. I was about to thank the person that pointed it out to me, but it turns out they were the author. :) I take it back; *these* are the most coherent, structural AI images I've seen.
Come, walk with me for a while through latent space. moultano.wordpress.com/2021/…