Engineers at UCLA explain how A.I. systems should be designed to both perform a task and win the trust of humans... (more…)
We’re releasing an analysis showing that since 2012 the amount of compute needed to train a neural net to the same performance on ImageNet classification has been decreasing by a factor of 2 every 16 months. Compared to 2012, it now takes 44 times less co... (more…)
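The quoted numbers are consistent with simple exponential decay: with a halving time of 16 months, the required compute shrinks by a factor of 2^(months/16). A minimal sketch, assuming the analysis covers roughly 2012 to mid-2020 (about 88 months; the exact span is an assumption here):

```python
# Sketch of the compute-efficiency trend quoted above: required training
# compute halves every 16 months. The ~2012 -> mid-2020 span (~88 months)
# is an assumption for illustration, not taken from the analysis itself.
def efficiency_gain(months_elapsed: float, doubling_months: float = 16) -> float:
    """Factor by which the compute needed for the same performance has dropped."""
    return 2 ** (months_elapsed / doubling_months)

gain = efficiency_gain(88)
print(f"~{gain:.0f}x less compute")  # on the order of the quoted 44x
```

The small gap between this back-of-the-envelope figure and the quoted 44x just reflects the assumed endpoint dates.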
Two research groups manage similar feats through very different methods. (more…)
AI caught everyone’s attention in 2023 with Large Language Models (LLMs) that can be instructed to perform general tasks, such as translation or coding, just by prompting. This naturally led to an intense focus on models as the primary ingredient in AI ap... (more…)
If you’ve followed parts 1, 2, 3, 4, and 5 of this series, you know that you really don’t need a lot of math to get started with AI. You can… (more…)