Learning Bayesian Statistics

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!

Visit our Patreon page to unlock exclusive Bayesian swag 😉

Takeaways:

  • Teaching Bayesian Concepts Using M&Ms: Tomi Capretto uses an engaging classroom exercise involving M&Ms to teach Bayesian statistics, making abstract concepts tangible and intuitive for students.
  • Practical Applications of Bayesian Methods: Discussion on the real-world application of Bayesian methods in projects at PyMC Labs and in university settings, emphasizing the practical impact and accessibility of Bayesian statistics.
  • Contributions to Open-Source Software: Tomi’s involvement in developing Bambi and other open-source tools demonstrates the importance of community contributions to advancing statistical software.
  • Challenges in Statistical Education: Tomi talks about the challenges and rewards of teaching complex statistical concepts to students who are accustomed to frequentist approaches, highlighting the shift to thinking probabilistically in Bayesian frameworks.
  • Future of Bayesian Tools: The discussion also touches on the future enhancements for Bambi and PyMC, aiming to make these tools more robust and user-friendly for a wider audience, including those who are not professional statisticians. 

Chapters:

05:36 Tomi’s Work and Teaching

10:28 Teaching Complex Statistical Concepts with Practical Exercises

23:17 Making Bayesian Modeling Accessible in Python

38:46 Advanced Regression with Bambi

41:14 The Power of Linear Regression

42:45 Exploring Advanced Regression Techniques

44:11 Regression Models and Dot Products

45:37 Advanced Concepts in Regression

46:36 Diagnosing and Handling Overdispersion

47:35 Parameter Identifiability and Overparameterization

50:29 Visualizations and Course Highlights

51:30 Exploring Niche and Advanced Concepts

56:56 The Power of Zero-Sum Normal

59:59 The Value of Exercises and Community

01:01:56 Optimizing Computation with Sparse Matrices

01:13:37 Avoiding MCMC and Exploring Alternatives

01:18:27 Making Connections Between Different Models

Thank you to my Patrons for making this episode possible!

Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.

Links from the show:

Transcript

This is an automatic transcript and may therefore contain errors. Please get in touch if you’re willing to correct them.

Speaker:

Today I am thrilled to host my friend Tomi Capretto, a multifaceted data scientist from

PyMC Labs, a dedicated statistics educator at Universidad Nacional de Rosario, and an avid

2

00:00:18,683 --> 00:00:24,886

contributor to the open source software community, especially known for his work on Bambi.

3

00:00:24,886 --> 00:00:32,108

In our conversation, Tomi shares insights from his dual role as an industry practitioner

and an academic,

4

00:00:32,108 --> 00:00:41,664

We dive deep into the practicalities and pedagogical approaches of teaching complex

statistical concepts, making them accessible and engaging.

5

00:00:41,965 --> 00:00:49,150

We also explore Tomi's contributions to Bambi, which he describes as BRMS for Python.

6

00:00:49,150 --> 00:00:57,275

And indeed, it is a Python library designed to make Bayesian modeling more approachable for beginners and non-experts.

7

00:00:57,275 --> 00:01:01,706

This discussion leads us into the heart of our newly launched course,

8

00:01:01,706 --> 00:01:13,574

Advanced Regression with Bambi and PyMC, where Tomi, Ravin Kumar and myself unpack the

essentials of regression models, tackle the challenges of parameter identifiability and

9

00:01:13,574 --> 00:01:20,239

overparameterization, and address overdispersion and the new zero-sum normal distribution.

10

00:01:20,419 --> 00:01:30,478

So whether you're a student, a professional, or just a curious mind, I'm sure this episode

is packed with insights that will enrich your understanding of the

11

00:01:30,478 --> 00:01:31,758

statistical world.

12

00:01:31,758 --> 00:01:38,832

This is Learning Bayesian Statistics, episode 112, recorded June 24, 2024.

13

00:01:40,812 --> 00:01:48,954

Welcome to Learning Bayesian Statistics, a podcast about Bayesian inference, the methods, the

projects, and the people who make it possible.

14

00:01:48,954 --> 00:01:51,125

I'm your host, Alex Andorra.

15

00:01:51,125 --> 00:01:55,546

You can follow me on Twitter at alex_andorra,

16

00:02:10,358 --> 00:02:11,199

like the country.

17

00:02:11,199 --> 00:02:15,422

For any info about the show, learnbayesstats.com is Laplace to be.

18

00:02:15,422 --> 00:02:22,608

Show notes, becoming a corporate sponsor, unlocking Bayesian merch, supporting the show on

Patreon, everything is in there.

19

00:02:22,608 --> 00:02:24,530

That's learnbayesstats.com.

20

00:02:24,530 --> 00:02:34,879

If you're interested in one-on-one mentorship, online courses, or statistical consulting, feel free to reach out and book a call at topmate.io slash alex underscore

21

00:02:34,879 --> 00:02:35,559

andorra.

22

00:02:35,559 --> 00:02:39,302

See you around, folks, and best Bayesian wishes to you.

23

00:02:42,670 --> 00:02:44,270

Hello, my dear Bayesians!

24

00:02:44,270 --> 00:02:47,090

A quick note before today's episode.

25

00:02:47,090 --> 00:02:50,370

StanCon 2024 is approaching!

26

00:02:50,370 --> 00:03:02,470

It's in Oxford, UK this year from September 9 to 13, and it's shaping up to be an incredible event for anybody interested in statistical modeling and Bayesian inference.

27

00:03:02,470 --> 00:03:10,590

Actually, we're currently looking for sponsors to help us offer more scholarships and make

StanCon more accessible to everyone.

28

00:03:10,590 --> 00:03:11,766

And we

29

00:03:11,766 --> 00:03:15,147

encourage you to buy your tickets as soon as possible.

30

00:03:15,147 --> 00:03:23,250

Not only will this help with making a better conference, but this will also support our

scholarship fund.

31

00:03:23,250 --> 00:03:32,833

For more details on tickets, sponsorships or community involvement, you'll find the StanCon website in the show notes. We're counting on you.

32

00:03:32,833 --> 00:03:34,694

OK, on to the show

33

00:03:39,170 --> 00:03:44,171

Tomi Capretto, bienvenido a Learning Bayesian Statistics.

34

00:03:44,772 --> 00:03:47,072

Hello Alex, muchas gracias.

35

00:03:47,072 --> 00:03:47,582

Thank you.

36

00:03:47,582 --> 00:03:49,973

Yeah, thanks a lot for taking the time.

37

00:03:50,533 --> 00:03:56,275

That's actually a bit weird to talk to you in English now because we usually talk in Spanish.

38

00:03:56,275 --> 00:03:57,435

Yeah.

39

00:03:57,435 --> 00:04:03,356

for the benefit of the world, we're gonna do that in English.

40

00:04:03,817 --> 00:04:04,637

So it's awesome.

41

00:04:04,637 --> 00:04:08,858

I'm really happy to have you on the show because, well...

42

00:04:09,618 --> 00:04:13,521

You started as a colleague, but with the years now...

43

00:04:14,102 --> 00:04:17,945

You're definitely a friend, or at least I consider you a friend.

44

00:04:19,207 --> 00:04:25,151

I will tell you after the recording if I consider you a friend, depending on how it goes.

45

00:04:25,512 --> 00:04:27,714

That's smart move, smart move.

46

00:04:27,714 --> 00:04:31,917

I've lost quite a few friends because of my editing skills.

47

00:04:33,619 --> 00:04:37,221

Yeah, so I mean, it's a long

48

00:04:37,780 --> 00:04:50,371

interview. I've had a lot of people who say you should have Tomi Capretto on the show, and I always answered, yeah, he'll come on the show very soon, don't worry, we're finishing working

49

00:04:50,371 --> 00:05:06,714

on a project right now together so well I'll invite him at that point so that he can talk

about the project and you guys maybe know what the project is about but you'll see at

50

00:05:07,094 --> 00:05:09,776

at the middle of the episode, more or less, people.

51

00:05:09,776 --> 00:05:16,152

But I mean, if you listen to the show regularly, you know which project I'm talking about.

52

00:05:16,152 --> 00:05:20,725

But first, Tomi, we'll talk a bit about you.

53

00:05:21,406 --> 00:05:28,492

Yeah, basically, can you tell people what you're doing nowadays?

54

00:05:28,793 --> 00:05:32,936

You know, and yeah, like, what do your days look like?

55

00:05:36,320 --> 00:05:42,292

So I'm doing quite a lot of things regularly.

56

00:05:42,493 --> 00:05:48,935

Mainly, I work at PyMC Labs with a great team.

57

00:05:48,935 --> 00:05:53,977

We work on very interesting projects doing Bayesian stats.

58

00:05:54,438 --> 00:06:02,121

I have the pleasure to be working with the people making the tool that I love.

59

00:06:02,921 --> 00:06:04,181

That's amazing.

60

00:06:05,430 --> 00:06:16,579

It's also great, I don't know, when we are working on a project and we realize PyMC isn't able to do something, or there's something broken.

61

00:06:16,800 --> 00:06:22,043

We are not wondering, is this going to be fixed at some point in time?

62

00:06:22,825 --> 00:06:24,726

Are the developers working on it?

63

00:06:24,726 --> 00:06:27,428

We can just go and change the things.

64

00:06:29,130 --> 00:06:34,173

Well, we have to be responsible because otherwise the community will hate us

65

00:06:34,306 --> 00:06:40,828

changing the things all the time, but I definitely really like it.

66

00:06:40,828 --> 00:06:44,570

So I work at PyMC Labs, it's my main job.

67

00:06:45,290 --> 00:06:49,832

I've been at PyMC Labs for around three years, I think.

68

00:06:51,632 --> 00:06:56,534

In parallel, I also teach in university here in Argentina.

69

00:06:56,534 --> 00:07:04,341

I live in Rosario, Argentina, which is like the third largest city.

70

00:07:04,341 --> 00:07:04,871

in the country.

71

00:07:04,871 --> 00:07:09,622

Ah, for those of us nerds who don't know Argentina...

72

00:07:09,622 --> 00:07:15,444

We'll see if I know Argentina well enough: after Buenos Aires, of course, and Córdoba.

73

00:07:15,444 --> 00:07:18,755

Yeah, I think that's the correct order.

74

00:07:18,755 --> 00:07:27,747

And of course, for the football fans, the city of Ángel Di María and Lionel Messi, of course.

75

00:07:27,747 --> 00:07:29,107

Yeah, correct.

76

00:07:29,868 --> 00:07:34,579

And for some niche fans of football,

77

00:07:34,593 --> 00:07:45,002

Like if you are from the UK or from some very particular area of Spain, also Marcelo Bielsa, who is a coach, a very particular coach. He is also from... I didn't know he was from

78

00:07:45,002 --> 00:07:58,052

Rosario too, OK. Yeah, yeah, yeah, we have very particular characters in the city. Yeah, now I understand why he's called El Loco. OK, OK. Yeah, yeah. That's how we call Tomi inside

79

00:07:58,052 --> 00:08:04,009

PyMC Labs. You are not supposed to tell that to people. Yeah, right, ooh, I'm sorry.

80

00:08:04,009 --> 00:08:05,950

I'm not supposed to say a lie to you.

81

00:08:06,091 --> 00:08:10,213

On the show I can't.

82

00:08:10,414 --> 00:08:13,015

Yeah, and so yeah, I live here in Rosario.

83

00:08:13,135 --> 00:08:16,497

In Rosario, I don't know why I'm telling that in English.

84

00:08:18,099 --> 00:08:21,380

I teach in our national university.

85

00:08:21,501 --> 00:08:28,505

There's a program in statistics, which is a program where I studied.

86

00:08:28,666 --> 00:08:31,627

Now I'm also teaching Bayesian statistics.

87

00:08:31,728 --> 00:08:33,581

There's a whole course

88

00:08:33,581 --> 00:08:38,441

dedicated to Bayesian statistics in the final year of the degree.

89

00:08:38,441 --> 00:08:40,301

It's a new course.

90

00:08:40,901 --> 00:08:42,981

It started in 2023.

91

00:08:42,981 --> 00:08:44,621

That was the first edition.

92

00:08:44,621 --> 00:08:47,941

Now we are finishing the second edition.

93

00:08:49,221 --> 00:09:01,621

The students hate and love us at the same time because we make them work a lot, but at the

end of the day they learn or at least that's what they say.

94

00:09:02,603 --> 00:09:05,775

And that's what we find in the things that they present.

95

00:09:07,157 --> 00:09:10,299

So yeah, those are my two main activities today.

96

00:09:10,300 --> 00:09:19,047

I'm also an open-source developer contributing mainly to Bambi, PyMC, ArviZ.

97

00:09:19,047 --> 00:09:28,275

Sometimes I am creating a random repository to play with something or some educational

tool.

98

00:09:30,037 --> 00:09:30,591

Yeah.

99

00:09:30,591 --> 00:09:33,684

And from time to time I teach courses.

100

00:09:33,684 --> 00:09:39,188

I've just finished teaching a Python course.

101

00:09:40,329 --> 00:09:53,620

But yeah, it's like a mixture between statistics, computers, Bayesian statistics, Python, also R, which was my first language.

102

00:09:53,721 --> 00:09:57,364

And yeah, that's the world we're living in.

103

00:09:57,364 --> 00:09:59,005

Yeah, yeah, definitely.

104

00:09:59,425 --> 00:10:01,036

You do a lot of things for sure.

105

00:10:01,036 --> 00:10:10,930

Yeah, I think we can go in different directions, but I'm actually curious if you can talk

about...

106

00:10:10,930 --> 00:10:19,113

I know you have an exercise in your class where you teach Bayesian stats, and you introduce them with M&Ms.

107

00:10:19,833 --> 00:10:20,954

yes!

108

00:10:20,994 --> 00:10:23,255

Can you talk a bit about that exercise on the show?

109

00:10:23,255 --> 00:10:26,476

I think it will be interesting for our listeners.

110

00:10:26,696 --> 00:10:28,157

yeah, yeah, definitely.

111

00:10:28,749 --> 00:10:33,652

To be completely honest and fair, it's not our idea.

112

00:10:33,652 --> 00:10:39,075

I mean, it's an idea that was actually published on a paper.

113

00:10:39,075 --> 00:10:42,346

I don't remember the name of the paper, but I'm gonna find it.

114

00:10:42,346 --> 00:10:43,817

I have it.

115

00:10:43,857 --> 00:10:50,480

And I'm gonna give you the real source of the game.

116

00:10:52,622 --> 00:10:54,503

But we have adapted that.

117

00:10:54,883 --> 00:10:58,945

Basically, the first day you enter

118

00:10:58,985 --> 00:11:07,512

a Bayesian classroom, the teachers present you a problem saying, hey, something happened with M&Ms.

119

00:11:07,512 --> 00:11:12,716

In our case, we used the local version, which are called Rocklets.

120

00:11:13,237 --> 00:11:14,578

It's basically the same.

121

00:11:14,578 --> 00:11:16,639

It's chocolate, different colors.

122

00:11:16,880 --> 00:11:28,299

And we tell them, hey, the owner of the factory suspects that there's something happening

with the machine that creates

123

00:11:29,409 --> 00:11:36,691

the M&Ms of a particular color, and you need to figure out what's happening.

124

00:11:36,811 --> 00:11:50,305

And so we give them... so we divide the students into groups, we give a bag to the different groups, and they have to open the bag, they have to count the number of pieces

125

00:11:50,305 --> 00:11:52,956

that they have of the different colors.

126

00:11:53,036 --> 00:11:57,777

At that point, the students realize that what they care about is whether it

127

00:11:57,939 --> 00:12:17,482

that particular color or not, and the idea is to start thinking like in a statistical plus Bayesian way: what is the quantity we are trying to estimate, or what is the quantity that

128

00:12:17,482 --> 00:12:27,539

will tell us the answer and then you say okay we are talking about a proportion all right

and do we know anything about that proportion?

129

00:12:27,757 --> 00:12:28,917

Well, it's a proportion.

130

00:12:28,917 --> 00:12:30,607

It can be between 0 and 1.

131

00:12:30,607 --> 00:12:32,797

It's a continuous quantity.

132

00:12:32,837 --> 00:12:39,957

And then, okay, we are going to work manually, so let's discretize that proportion.

133

00:12:39,957 --> 00:12:43,897

And we have 11 values from 0 to 1.

134

00:12:44,097 --> 00:12:49,617

And then, okay, what else do we know about that proportion?

135

00:12:50,177 --> 00:12:53,337

Are all the values equally likely?

136

00:12:54,137 --> 00:12:57,983

And you can notice that we are starting to build a prior.

137

00:12:58,157 --> 00:13:02,697

And students are like, no, we have five colors.

138

00:13:04,297 --> 00:13:15,637

The probability of this color being present 80 % of the time is not the same as the

probability of this color being present 20 % of the time, for example.

139

00:13:15,637 --> 00:13:28,323

And so we start like in a very manual way to build a probability distribution, which is

the prior for the proportion of items that are of that color.

140

00:13:28,865 --> 00:13:35,987

And then we say, okay, what's the kind of the data that we are collecting?

141

00:13:36,007 --> 00:13:39,328

And we end up saying, okay, this is a binomial experiment.

142

00:13:39,328 --> 00:13:45,070

And we talk about the different assumptions, independence, constant probability.

143

00:13:45,210 --> 00:13:48,281

And then, okay, how can we combine this information together?

144

00:13:48,281 --> 00:13:54,212

And we naturally talk about Bayes' theorem.

145

00:13:56,453 --> 00:14:17,475

And yeah, we do all the math by hand with very simple numbers, but in a very intuitive way

with a problem that is interesting for students because they know those chocolates, they

146

00:14:17,475 --> 00:14:24,171

can feel it makes sense to put what they know about the problem into a probability

distribution.

147

00:14:24,171 --> 00:14:27,183

because they know that they know something about the problem.

148

00:14:27,804 --> 00:14:37,330

And doing some very simple math using probability rules that they already know, we can arrive at a solution in a Bayesian way.

149

00:14:37,450 --> 00:14:46,496

And the end of that lesson is, okay, everything we did so far is what we are going to do

in this course.

150

00:14:46,496 --> 00:14:52,260

Like we are going to learn more about this approach to do statistics.

151

00:14:52,761 --> 00:14:53,707

And yeah.

152

00:14:53,707 --> 00:14:58,459

In the very end, they can eat the data, basically.
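[For readers who want to try the exercise at home, here is a minimal sketch of the same grid computation in Python. The prior weights and the bag counts are invented for illustration; they are not the course's actual numbers.]

```python
import numpy as np

# 11 candidate values for the proportion of the suspect color: 0, 0.1, ..., 1
theta = np.linspace(0, 1, 11)

# A hand-built prior: with five colors, values near 0.2 feel most plausible to students
prior = np.array([0.05, 0.20, 0.30, 0.20, 0.10, 0.05, 0.04, 0.03, 0.02, 0.01, 0.0])
prior = prior / prior.sum()

# Data from one bag (made-up numbers): 7 pieces of the suspect color out of 30
y, n = 7, 30

# Binomial likelihood for each candidate proportion (constant factor dropped)
likelihood = theta**y * (1 - theta) ** (n - y)

# Bayes' theorem on the grid: posterior is proportional to prior times likelihood
posterior = prior * likelihood
posterior = posterior / posterior.sum()

for t, p in zip(theta, posterior):
    print(f"theta = {t:.1f}  posterior = {p:.3f}")
```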

153

00:14:58,560 --> 00:15:01,361

And that's really interesting.

154

00:15:02,422 --> 00:15:07,264

In the very first edition, we used Rocklets, which are like M&Ms.

155

00:15:07,605 --> 00:15:11,666

And in the second edition, we used Gummy Bears.

156

00:15:12,487 --> 00:15:17,150

But the logic was more or less the same, but we changed the product.

157

00:15:17,150 --> 00:15:23,149

And I don't know what you're going to do in the next edition, but it will have some

158

00:15:23,149 --> 00:15:24,369

involved.

159

00:15:25,849 --> 00:15:39,968

It's definitely very interesting and I'm fascinated by these approaches to introduce stats

to people which are more intuitive.

160

00:15:40,629 --> 00:15:44,469

The student is involved in the problem from the very beginning.

161

00:15:44,469 --> 00:15:51,689

You don't start with a list of 10 abstract concepts

162

00:15:52,043 --> 00:15:55,655

Perhaps they know how to follow, but it's less attractive.

163

00:15:55,655 --> 00:15:59,517

So yeah, we do that and I really like that approach.

164

00:15:59,837 --> 00:16:01,258

Yeah, yeah.

165

00:16:01,258 --> 00:16:02,989

I mean, that's definitely super fun.

166

00:16:02,989 --> 00:16:06,701

That's why I want you to do that on the show.

167

00:16:06,701 --> 00:16:13,844

I think it's a great way to teach stats, and we'll definitely add that to the show notes, as you were saying.

168

00:16:14,945 --> 00:16:21,008

And for next year, well, I think you definitely should do that with alfajores.

169

00:16:22,557 --> 00:16:24,858

Let's see if we have the budget to do that.

170

00:16:24,938 --> 00:16:27,189

Yeah, it's gonna be a bit more budget, yeah for sure.

171

00:16:27,189 --> 00:16:32,141

I mean, the best would be with empanadas, but that should be not very...

172

00:16:32,141 --> 00:16:35,983

that shouldn't be very easy to do, you know, like the empanada can break.

173

00:16:35,983 --> 00:16:37,824

Nah, it's gonna be a whole mess.

174

00:16:37,824 --> 00:16:38,214

you know...

175

00:16:38,214 --> 00:16:48,368

Yeah, I know, and that usually happens like early in the morning, so the students will be

like, what are we doing here?

176

00:16:49,809 --> 00:16:51,089

Yeah, it's

177

00:16:51,093 --> 00:16:55,155

It's a nice confusion because it creates a nice, an impact.

178

00:16:55,155 --> 00:17:01,027

Like they enter the classroom and instead of having people saying, Hey, this is my name.

179

00:17:01,377 --> 00:17:02,358

we are going to work on that.

180

00:17:02,358 --> 00:17:05,279

It's like, Hey, you have this problem.

181

00:17:05,279 --> 00:17:07,239

Take some gummy bears.

182

00:17:07,239 --> 00:17:08,320

And they're like, what?

183

00:17:08,320 --> 00:17:09,540

What's happening?

184

00:17:09,721 --> 00:17:11,941

So that's, it's attractive.

185

00:17:12,342 --> 00:17:12,672

Yeah.

186

00:17:12,672 --> 00:17:12,902

Yeah.

187

00:17:12,902 --> 00:17:14,062

No, for sure.

188

00:17:14,883 --> 00:17:20,545

Most of your students, like, do they already know about stats?

189

00:17:20,661 --> 00:17:20,891

Yes.

190

00:17:20,891 --> 00:17:23,422

you're teaching them the Bayesian way?

191

00:17:23,663 --> 00:17:25,764

Yeah, yeah, so at that point...

192

00:17:25,764 --> 00:17:30,356

What's their most, you know, what's the most confusing part to them?

193

00:17:30,356 --> 00:17:36,449

How do they react to that new framework?

194

00:17:36,589 --> 00:17:42,833

I would say in general, we had good experiences, especially at the end of the journey.

195

00:17:44,394 --> 00:17:49,741

But in the very beginning, so when they start the course...

196

00:17:49,741 --> 00:18:04,381

They already have like 20 courses, let's say 15 because other courses are focused on

mathematics or programming, but they already have like 15 courses about statistics, but

197

00:18:04,381 --> 00:18:08,941

they are all about the non-Bayesian approach.

198

00:18:09,721 --> 00:18:11,701

So frequentist approach.

199

00:18:11,721 --> 00:18:18,221

They know a lot about maximum likelihood estimation and all the properties.

200

00:18:19,467 --> 00:18:30,020

At that point, they already spent hours writing mathematical formulas and demonstrating

results and all that.

201

00:18:30,100 --> 00:18:38,222

But they are very new to Bayesian statistics, because all they know about Bayes is Bayes' rule.

202

00:18:38,222 --> 00:18:40,663

That's the only thing they know.

203

00:18:40,663 --> 00:18:48,865

And they also know there's an estimation method called the Bayesian method, but

204

00:18:49,601 --> 00:18:51,842

they are not using that at that point.

205

00:18:52,083 --> 00:19:08,957

And one thing (there may be other things, but one thing) that takes some time for them to adapt to is: okay, parameters are not fixed anymore.

206

00:19:09,118 --> 00:19:17,345

And I put a probability distribution on top of that because in all the courses they took

before our course,

207

00:19:17,613 --> 00:19:24,757

there's a lot of emphasis on how to interpret confidence intervals, p-values and classical statistics.

208

00:19:25,157 --> 00:19:38,324

At that point, they are not the typical student that is confused about interpreting confidence intervals, p-values and frequentist stats, because they practice that a lot.

209

00:19:39,085 --> 00:19:45,628

But then it's hard for them to switch from parameters are fixed

210

00:19:47,438 --> 00:19:58,582

our interval either contains the parameter or not, but we don't know it, to, parameters

are random quantities and we put probability distributions on top of them.

211

00:20:00,643 --> 00:20:05,004

So there's a cost there, which is not huge.

212

00:20:05,425 --> 00:20:14,929

And what was really nice for us: Monte Carlo is something that really helped us. From very early on, we start

213

00:20:15,607 --> 00:20:26,424

computing quantities of interest with Monte Carlo, when they realize the power in that

approach, they're like, I really like this.

214

00:20:27,605 --> 00:20:40,534

Because I have a probability distribution and I'm interested in this particular

probability, or I'm interested in a probability involving two random variables, or in many

215

00:20:40,534 --> 00:20:40,895

things.

216

00:20:40,895 --> 00:20:44,477

Once they discover how powerful that approach is,

217

00:20:46,023 --> 00:20:48,471

They're like, this is really nice.
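[To make the Monte Carlo point concrete, here is a small sketch of the kind of computation described: estimating a probability involving two random variables directly from draws. The distributions below are arbitrary placeholders, not draws from any real model.]

```python
import numpy as np

rng = np.random.default_rng(42)

# Pretend these are posterior draws for two parameters (placeholder distributions)
mu_a = rng.normal(loc=0.50, scale=0.05, size=10_000)
mu_b = rng.normal(loc=0.45, scale=0.05, size=10_000)

# Any quantity of interest is just a function of the draws
prob_a_greater = (mu_a > mu_b).mean()                      # estimate of P(mu_a > mu_b)
diff_interval = np.quantile(mu_a - mu_b, [0.03, 0.97])     # 94% interval for the difference

print(f"P(mu_a > mu_b) is approximately {prob_a_greater:.3f}")
print(f"94% interval for mu_a - mu_b: {diff_interval.round(3)}")
```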

218

00:20:51,213 --> 00:20:56,217

But yeah, it's a challenge, but I really like it.

219

00:20:56,217 --> 00:21:02,922

And I think at the end of the day, they also like it and they see the power in the

approach.

220

00:21:03,803 --> 00:21:12,370

In fact, I have a student that's right now working on a Google Summer of Code project with

Bambi.

221

00:21:12,370 --> 00:21:13,851

So it's Bayesian stats.

222

00:21:13,992 --> 00:21:20,781

And it seems I'm going to have another student working on a hierarchical model for his

223

00:21:20,781 --> 00:21:24,902

So yeah, it's really nice.

224

00:21:25,042 --> 00:21:26,903

Nice, yeah, yeah, for sure.

225

00:21:27,943 --> 00:21:29,563

Who is the...

226

00:21:29,684 --> 00:21:33,624

So I know also, I think if I remember correctly, there is...

227

00:21:33,845 --> 00:21:37,486

So you know Gabriel, who works on Bambi.

228

00:21:37,486 --> 00:21:39,666

I don't remember his last name right now, do you?

229

00:21:39,666 --> 00:21:43,187

It's hard, it's Gabriel Stechschulte.

230

00:21:43,187 --> 00:21:44,448

I don't know...

231

00:21:44,448 --> 00:21:45,908

yes, something like that.

232

00:21:45,908 --> 00:21:47,208

So sorry, Gabriel.

233

00:21:47,208 --> 00:21:49,919

But Gabriel is also a patron of the show.

234

00:21:49,981 --> 00:21:54,144

of Learning Bayesian Statistics, so he's really in the Bayesian state of mind.

235

00:21:54,205 --> 00:22:03,193

Thank you so much, Gabriel, for all the support to Learning Bayesian Statistics, but also, and even more importantly, the work you do on Bambi.

236

00:22:03,193 --> 00:22:14,642

I know you've helped me a few months ago on a PR for HSGP, where I was testing Bambi's

HSGP capabilities to the limit.

237

00:22:17,197 --> 00:22:26,142

Thank you so much, Gabriel and Tomi, of course, for developing Bambi all the time and pushing the boundaries on that.

238

00:22:27,243 --> 00:22:29,144

I know Gabriel.

239

00:22:29,144 --> 00:22:34,706

So he was working in the industry and now he's back to academia, but in a more research

role.

240

00:22:34,727 --> 00:22:42,831

And sorry, Gabriel, I don't remember all the details about this, but I do remember he was doing something very cool, applying Bayesian stats.

241

00:22:42,831 --> 00:22:46,775

So I'm like nudging

242

00:22:46,775 --> 00:22:51,907

him publicly to someday tell the world about what he does.

243

00:22:51,907 --> 00:22:55,928

Because I remember being like, this is quite interesting.

244

00:22:57,289 --> 00:22:57,850

So yeah.

245

00:22:57,850 --> 00:22:58,570

Definitely.

246

00:22:58,570 --> 00:22:59,970

Yeah, for sure.

247

00:23:00,210 --> 00:23:01,391

Yeah.

248

00:23:01,391 --> 00:23:04,072

Actually, let's talk about Bambi.

249

00:23:04,072 --> 00:23:06,803

I think it's going to be very interesting to listeners.

250

00:23:06,803 --> 00:23:16,857

So yeah, can you tell us what Bambi is about basically and why would people

251

00:23:17,421 --> 00:23:18,361

use it.

252

00:23:18,361 --> 00:23:26,181

The way I usually do that is, at least if people know R, I tell them it's like BRMS in Python.

253

00:23:26,181 --> 00:23:35,281

If you're interested in BRMS and don't know what that is, I think it's episode 35 with

Paul Bürkner, he was on the show, I put that in the show notes.

254

00:23:35,321 --> 00:23:44,711

But if you want Tomi's definition of Bambi now, as one of the main core devs of Bambi, well here it is folks.

255

00:23:45,901 --> 00:23:59,941

To be honest, your definition was already really good because it's one of the definitions I usually give when I know the other party knows about BRMS.

256

00:24:01,281 --> 00:24:14,761

basically, if you don't know R, I can tell you like in 30 seconds, R has a very particular

syntax to specify regression models.

257

00:24:16,215 --> 00:24:24,447

where you basically say, okay, this is my outcome variable, use a symbol, which is a

tilde, and you say, these are my predictors.

258

00:24:24,447 --> 00:24:30,189

And you pass that to a function together with a data frame, which is a very convenient

structure.

259

00:24:30,549 --> 00:24:43,292

And that function knows how to map the names of the predictors to parameters and variables

in the model.

260

00:24:43,292 --> 00:24:46,123

It knows how to take a model formula

261

00:24:47,093 --> 00:24:57,799

and a data frame and some other information that's not always needed, and it constructs a

model with that information.

262

00:24:58,079 --> 00:25:03,403

So that's very built into R.

263

00:25:03,403 --> 00:25:09,105

Like if you go back to, I think to the S language, the formula syntax already existed.

264

00:25:10,086 --> 00:25:15,009

Then the R language has the formula syntax in the base packages.

265

00:25:15,923 --> 00:25:25,601

And a lot of packages built by people in R use the formula syntax to specify regression

models.

266

00:25:26,222 --> 00:25:43,856

And a lot of people also extended the formula syntax to account for other things, like one extension that we incorporated in Bambi is the syntax to have what in frequentist stats you

267

00:25:43,856 --> 00:25:45,377

call random effects.

268

00:25:47,646 --> 00:26:02,210

That appeared, I think, the first time in the lme4 package, which is a very popular package in R to work with mixed effects models, which is another name for hierarchical models. It's

269

00:26:02,210 --> 00:26:12,693

crazy how many names you have for that so basically in R you have this formula syntax and

this very short way of writing a statistical model

270

00:26:12,693 --> 00:26:18,658

and lot of people created a lot of packages to have a larger variety of models.

271

00:26:20,020 --> 00:26:20,809

Then go to Python.

272

00:26:20,809 --> 00:26:22,241

Let's go to Python.

273

00:26:22,722 --> 00:26:26,104

Python is a more general programming language.

274

00:26:26,866 --> 00:26:32,350

It has great support for statistics, machine learning, Bayesian stats, and all that.

275

00:26:32,410 --> 00:26:38,735

But you don't have something like a model formula built in the language.

276

00:26:41,291 --> 00:26:53,587

I think one of the very first attempts to build that, which was extremely successful, it's

Patsy, which is a library developed by...

277

00:26:54,508 --> 00:26:56,849

I don't remember the name of the guy, sorry.

278

00:26:57,090 --> 00:27:01,632

I think it's Nathaniel, but I don't remember the last name.

279

00:27:01,632 --> 00:27:03,613

But that's like...

280

00:27:05,153 --> 00:27:16,489

As far as I know, the first package and the largest package that brought the model

formulas to Python, and then other libraries started to build on top of that Patsy

281

00:27:16,489 --> 00:27:17,419

library.

282

00:27:17,720 --> 00:27:20,021

For example, statsmodels.

283

00:27:20,061 --> 00:27:32,728

And statsmodels allows you, not to copy and paste your R code, but basically to say, this is how I would create a linear regression model in R.

284

00:27:32,728 --> 00:27:35,329

Okay, in Python, what do I need to

285

00:27:35,423 --> 00:27:41,896

Okay, I need a pandas data frame and a model formula that is passed in as a string, and it works the same way.
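[As a small illustration of that workflow, here is a sketch using statsmodels' formula API. The data frame and column names are invented for the example.]

```python
import pandas as pd
import statsmodels.formula.api as smf

# A toy data frame; in practice this would be your own data
df = pd.DataFrame({
    "y": [2.3, 3.1, 4.0, 5.2, 6.1],
    "x": [1, 2, 3, 4, 5],
    "group": ["a", "a", "b", "b", "b"],
})

# R-style formula passed as a string: outcome ~ predictors
model = smf.ols("y ~ x + group", data=df).fit()
print(model.summary())
```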

286

00:27:42,697 --> 00:27:54,144

And so as it happened in R with people creating packages to extend those capabilities, the

same happened in Python.

287

00:27:54,144 --> 00:28:00,027

Like you have statsmodels, which is very popular, but there are also many other libraries.

288

00:28:00,027 --> 00:28:03,449

And one of those libraries is Bambi, which extends

289

00:28:03,801 --> 00:28:08,384

the model formula and uses the model formula in a Bayesian context.

290

00:28:09,505 --> 00:28:14,809

Bambi stands for BAyesian Model-Building Interface.

291

00:28:15,790 --> 00:28:25,176

It uses a model formula and a syntax very similar to the syntax that you find in R to create Bayesian models.
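[A minimal sketch of what that looks like in Bambi, including the lme4-style group-specific-effects syntax mentioned above. The dataset and column names are invented, and priors are left at Bambi's defaults for quick iteration.]

```python
import bambi as bmb
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Toy data: reaction times for 10 subjects across two conditions (invented for illustration)
df = pd.DataFrame({
    "rt": rng.normal(300, 30, size=60),
    "condition": np.tile(["control", "treatment"], 30),
    "subject": np.repeat([f"s{i}" for i in range(10)], 6),
})

# R / brms-like formula; (1|subject) adds group-specific (random) intercepts
model = bmb.Model("rt ~ condition + (1|subject)", df)

# Fitting runs PyMC's MCMC sampler under the hood
idata = model.fit(draws=1000, tune=1000, chains=2)
```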

292

00:28:28,205 --> 00:28:37,830

I think what's great about it is that you're not only creating the model, but you also have a lot of functionalities to work with the model.

293

00:28:37,830 --> 00:28:52,358

For example, obtain predictions, which is not trivial in many cases, or compute some summary of interest, or help you to find priors that are sensible for the problem that

294

00:28:52,358 --> 00:28:53,259

you have.

295

00:28:53,899 --> 00:28:56,941

And so yeah, I joined.

296

00:28:56,941 --> 00:29:16,821

the Bambi project, I think it was in 2020 or 2021, while working with Osvaldo, he was my director in CONICET, which is like a national institute for science and technology here in

297

00:29:16,821 --> 00:29:17,841

Argentina.

298

00:29:19,321 --> 00:29:26,217

yeah, and I really liked the interface and I saw many points that could be

299

00:29:26,217 --> 00:29:35,860

improved, mainly that Bambi didn't support the syntax for random effects.

300

00:29:36,740 --> 00:29:43,162

Actually, no Python library supported that because Patsy didn't support that.

301

00:29:43,962 --> 00:29:54,045

And at that point in time, I was learning about programming languages and I was like,

well, maybe it's time to write a parser for model formulas.

302

00:29:54,085 --> 00:29:55,403

And that's what I did.

303

00:29:55,403 --> 00:30:02,337

And that was my first big contribution to Bambi.

304

00:30:06,519 --> 00:30:11,172

And then we started to add, I don't know, more model families.

305

00:30:11,172 --> 00:30:16,184

So Bambi now supports many more likelihood functions.

306

00:30:16,585 --> 00:30:24,169

We started to add better default priors because the goal of these libraries is to allow

you to

307

00:30:24,245 --> 00:30:25,885

a quick iteration.

308

00:30:26,626 --> 00:30:32,398

It's not that we are rooting for, you should all use default priors and automatic priors.

309

00:30:32,398 --> 00:30:34,129

No, please don't do that.

310

00:30:34,849 --> 00:30:41,552

But if you want to have something quick and iterate quick, then that's not a bad idea.

311

00:30:41,552 --> 00:30:53,429

Once you more or less have a more refined idea of your model, then you can sit down and say, okay, let's really think

312

00:30:53,429 --> 00:30:54,730

about the priors.

313

00:30:55,531 --> 00:31:01,714

So to summarize, Bambi is a package built on top of PyMC.

314

00:31:01,715 --> 00:31:03,435

I didn't mention that before.

315

00:31:05,257 --> 00:31:20,087

That allows people to write, fit and work with Bayesian models in Python without having to write a model in a probabilistic programming language.

316

00:31:21,468 --> 00:31:23,265

There's a trade-off.

317

00:31:23,265 --> 00:31:29,948

Like you can write a very complex model in two or three lines of code.

318

00:31:30,069 --> 00:31:35,031

If you want full flexibility, you should use PyMC.

319

00:31:37,553 --> 00:31:44,917

And to conclude, as I said, Bambi is the BRMS of Python.

320

00:31:46,198 --> 00:31:52,721

We always take like BRMS as an inspiration and also as

321

00:31:53,291 --> 00:32:10,052

Yeah, what we want to have in many cases because implementing Bambi, I learned a lot about

BRMS and how great it is actually because the complexities it can handle and the variety

322

00:32:10,052 --> 00:32:17,177

of models and kind of things you can have in a model in BRMS is huge.

323

00:32:17,418 --> 00:32:22,923

I mean, I'm not aware of any other interface like this that supports

324

00:32:22,923 --> 00:32:25,874

as many things, Bayesian and non-Bayesian.

325

00:32:25,874 --> 00:32:29,515

I mean, it's really amazing.

326

00:32:30,995 --> 00:32:38,477

And yeah, we are always taking ideas from BRMS.

327

00:32:39,037 --> 00:32:42,198

Yeah, yeah, great summary.

328

00:32:42,198 --> 00:32:43,018

Thanks, Tomi.

329

00:32:43,018 --> 00:32:46,279

And even like, a brief history of Bambi, I love that.

330

00:32:46,659 --> 00:32:52,061

So in the show notes, I added the link to

331

00:32:52,939 --> 00:33:03,156

that you mentioned, and also the link to the very first Learn Bayes Stats episode, which was with Osvaldo Martin.

332

00:33:03,156 --> 00:33:05,338

So it was episode number one.

333

00:33:05,338 --> 00:33:08,370

It's definitely a vintage one, people.

334

00:33:08,370 --> 00:33:09,559

Feel free to...

335

00:33:09,560 --> 00:33:11,041

I have a fun story about that.

336

00:33:11,041 --> 00:33:11,882

yeah?

337

00:33:13,983 --> 00:33:17,716

I don't know if I told you about this story but when Osvaldo recorded that...

338

00:33:17,716 --> 00:33:18,866

I think I know.

339

00:33:18,927 --> 00:33:19,947

Yeah, you know, know.

340

00:33:19,947 --> 00:33:20,358

When Osvaldo...

341

00:33:20,358 --> 00:33:22,379

but I don't know if the public

342

00:33:22,379 --> 00:33:23,969

knows about that story.

343

00:33:24,930 --> 00:33:35,056

So Osvaldo and I used to work like in the same building, not in the exact same office, but

his office was in front of my office.

344

00:33:35,056 --> 00:33:38,618

So if he was talking to someone, I could listen.

345

00:33:38,818 --> 00:33:43,341

Not very clearly, but I could realize he was talking.

346

00:33:43,341 --> 00:33:51,211

And some random day I was in the office and I noticed that he was talking English, but

alone.

347

00:33:51,211 --> 00:33:52,982

Like, not with another person.

348

00:33:53,042 --> 00:33:55,283

And I said, what is he doing?

349

00:33:55,403 --> 00:34:05,509

And then after that, he told me, yes, I was interviewed in a podcast that this other guy who's been contributing to ArviZ is starting.

350

00:34:05,509 --> 00:34:07,400

And yeah, I think it's very cool.

351

00:34:07,400 --> 00:34:09,811

I think it went very well.

352

00:34:10,151 --> 00:34:20,917

And at that point in time, I didn't know you, but I knew there was a podcast guy and it

turns out that I witnessed

353

00:34:21,213 --> 00:34:26,735

the first recording of Learning Bayesian Statistics, which is pretty fun.

354

00:34:26,836 --> 00:34:30,157

And look where we are now.

355

00:34:30,477 --> 00:34:31,798

Pretty interesting.

356

00:34:32,218 --> 00:34:33,478

Yeah, this is really cool.

357

00:34:33,478 --> 00:34:34,999

I love that story.

358

00:34:35,039 --> 00:34:38,070

It was already all linked together.

359

00:34:38,070 --> 00:34:40,121

I love that.

360

00:34:40,121 --> 00:34:42,082

Yeah.

361

00:34:42,502 --> 00:34:44,383

Yeah.

362

00:34:44,923 --> 00:34:49,045

I really love Bambi for what you just said, I think.

363

00:34:49,215 --> 00:34:56,011

It's a great way to start and iterate very fast on the model.

364

00:34:56,011 --> 00:35:06,880

And then if you validate the concept, then you can switch to PyMC and build the model again, but then build on top of that.

365

00:35:06,880 --> 00:35:11,203

And that's going to make all your modeling workflow way faster.

366

00:35:11,284 --> 00:35:11,664

Yeah.

367

00:35:11,664 --> 00:35:12,745

really love that.

368

00:35:12,745 --> 00:35:17,348

Another thing also that's really good is for teaching, especially beginners,

369

00:35:17,889 --> 00:35:22,190

that will abstract away a lot of the choices that need to be made in the model.

370

00:35:22,190 --> 00:35:32,533

As you were saying, it's not necessarily what you want to do all the time, but at least to

start with, you know, it's like when you start learning a new sport.

371

00:35:32,533 --> 00:35:42,116

Yes, there are tons of nuances to learn, but, you know, if you focus on one or two things,

you already have the Pareto effect.

372

00:35:42,116 --> 00:35:47,697

Well, then Bambi allows you to do that, and I think that's extremely valuable.

373

00:35:48,001 --> 00:35:56,067

Yeah, and another point I'm realizing I forgot to mention is that it lowers the entrance barrier.

374

00:35:56,067 --> 00:36:09,175

Like, there are a lot of people who are not statisticians, but they do stats because they have experiments, or they are studying something and they have data and they have

375

00:36:09,175 --> 00:36:15,320

some level of familiarity with some models and they know that that's the model they want

to fit.

376

00:36:15,320 --> 00:36:17,701

But probably writing PyMC

377

00:36:17,823 --> 00:36:34,482

and working with indexes and dims and coords is too much, and going to Stan and typing everything is also too much, and they don't work with R and they want some higher level

378

00:36:34,482 --> 00:36:38,384

interface to work with, then Bambi is also what they use.

379

00:36:39,585 --> 00:36:42,907

And yeah, I also really like that.

380

00:36:42,907 --> 00:36:46,948

It makes Bayesian stats

381

00:36:48,725 --> 00:36:57,387

more welcoming for people that are not experts at writing code, which is completely fine.

382

00:36:57,387 --> 00:37:09,971

Because a lot of people out there are trying to solve already difficult problems, and adding the extra complexity of being an expert in a PPL may be too much.

383

00:37:09,971 --> 00:37:13,171

So that's also another reason to have these interfaces.

384

00:37:13,452 --> 00:37:14,522

Yeah, yeah, yeah.

385

00:37:14,522 --> 00:37:16,632

I definitely completely agree.

386

00:37:18,675 --> 00:37:20,886

I think that's also...

387

00:37:21,026 --> 00:37:34,171

So basically, if people are curious about Bambi and want to get started with that, I definitely recommend taking a look at Bambi's website that I put in the show notes.

388

00:37:35,793 --> 00:37:43,416

And also, well, probably then about our new course, Tomi, that's the project that was in the notes.

389

00:37:43,416 --> 00:37:46,605

So this is also why I am happy to have you on the show here.

390

00:37:46,605 --> 00:37:53,165

So the course is called Advanced Regression with Bambi and PyMC.

391

00:37:53,485 --> 00:38:00,695

Precisely, it's on the Intuitive Bayes website, so of course I put that in the show notes for people who want to take a look at it.

392

00:38:00,695 --> 00:38:03,805

If you're a patron of the show, you have 10% off.

393

00:38:03,805 --> 00:38:10,985

This is the only discount that we do, so I hope you appreciate it.

394

00:38:10,985 --> 00:38:13,285

That's how special you are.

395

00:38:13,285 --> 00:38:15,585

Thank you so much, patrons.

396

00:38:16,371 --> 00:38:29,171

And yeah, maybe, Tomi, tell us about, you know, the course and what it is about and for whom in particular that would be.

397

00:38:29,171 --> 00:38:31,133

We spent a lot of time on this course.

398

00:38:31,133 --> 00:38:33,315

It took us two years to develop.

399

00:38:33,315 --> 00:38:36,457

So, yeah, I'm super happy about it.

400

00:38:36,457 --> 00:38:39,539

I'm also super happy that it's done.

401

00:38:41,001 --> 00:38:46,357

But yeah, maybe give us the elevator pitch for the course, who it's for,

402

00:38:46,357 --> 00:38:48,998

and why would people even care about it?

403

00:38:50,519 --> 00:39:07,718

So the Advanced Regression course is a very interesting course with a lot of material, with a lot of very well-thought-out material, which in all cases went through a lot of

404

00:39:07,718 --> 00:39:10,450

reviews.

405

00:39:10,450 --> 00:39:16,493

As the title says, it's a course about regression, but also as the title says,

406

00:39:16,521 --> 00:39:19,143

it's an advanced regression course.

407

00:39:19,163 --> 00:39:32,004

It doesn't mean it starts from the beginning being extremely advanced and it doesn't mean

it involves the craziest mathematical formulas that you're going to see in your life, but

408

00:39:32,004 --> 00:39:44,084

it means it's the course you have to take if you want to give, sorry, if you want to take

that second or third step in your learning journey.

409

00:39:46,411 --> 00:39:59,115

Like for example, if you took an introductory course like yours or another introductory

course and you feel that's not enough or you are open to learn more, you are eager to

410

00:39:59,115 --> 00:40:02,946

learn more, then that's the course for you.

411

00:40:03,046 --> 00:40:13,088

Of course, it has a Bayesian approach and it uses a lot of Python, Bambi and PyMC.

412

00:40:15,753 --> 00:40:22,017

Every time I talk about regression, I want to qualify something.

413

00:40:23,098 --> 00:40:30,503

I remember a conversation I had with colleagues when I was just starting in a previous

job.

414

00:40:32,205 --> 00:40:42,852

They were telling me they were taking a course about statistics, like those courses where you have a ton of topics, but only very lightly covered.

415

00:40:44,759 --> 00:40:47,531

And they were like, yeah, the first two units is regression.

416

00:40:47,531 --> 00:40:48,872

And this is a lot.

417

00:40:48,872 --> 00:40:54,216

And I was telling them, in university, I had six courses about regression.

418

00:40:54,577 --> 00:41:02,763

It was not just two units in a course.

419

00:41:03,184 --> 00:41:09,890

And that's because I think in many cases, people think that regression is something very

simple.

420

00:41:09,890 --> 00:41:14,273

It's the linear regression that you learn in

421

00:41:14,497 --> 00:41:25,183

a basic statistics course: you have an outcome variable and you have a predictor, then that's simple linear regression.

422

00:41:25,183 --> 00:41:28,745

You have multiple predictors, you have multiple linear regression.

423

00:41:28,745 --> 00:41:30,265

And that's it.

424

00:41:30,426 --> 00:41:33,107

That's all linear regression gives you.

425

00:41:33,127 --> 00:41:39,010

And all the rest are crazier things that fall under the machine learning umbrella.

426

00:41:40,751 --> 00:41:43,553

But in the course, we see that that's not

427

00:41:43,783 --> 00:41:45,133

the whole story.

428

00:41:46,794 --> 00:42:02,728

So many things are regressions or if you don't like the term maybe we can give you a

better term in the future but so many things are linear models which sounds pretty basic

429

00:42:02,728 --> 00:42:03,158

right?

430

00:42:03,158 --> 00:42:13,221

You say this is a linear model this is a linear equation it's like this is for dummies but

if you're curious take the course and and you will see

431

00:42:14,529 --> 00:42:18,210

With linear models, you can do a lot of crazy things.

432

00:42:18,210 --> 00:42:23,271

Of course, we start with simple linear regression and we do multiple linear regression.

433

00:42:23,271 --> 00:42:39,436

But then very quickly, we go to logistic regression, Poisson regression, we talk about categorical regression, multinomial regression, when your outcome is categories and you

434

00:42:39,436 --> 00:42:41,156

have multiple categories.

435

00:42:41,476 --> 00:42:43,937

And then it goes crazy.

436

00:42:45,190 --> 00:43:01,124

and we have zero inflation and we have overdispersion and we finalize the course talking

about hierarchical models in the context of regressions and it ends with a very

437

00:43:01,124 --> 00:43:03,445

interesting model that you developed.

438

00:43:06,188 --> 00:43:11,853

So the course is very complete, it starts

439

00:43:13,045 --> 00:43:17,986

with a few things that we assume people know, but we review them.

440

00:43:19,207 --> 00:43:23,128

But then very soon we start covering new things.

441

00:43:25,549 --> 00:43:32,350

I think in all cases we show how to do things with Bambi and how to do them with PyMC.

442

00:43:32,551 --> 00:43:34,611

We have a lot of visualizations.

443

00:43:36,052 --> 00:43:41,793

Our editor did an amazing job at editing the video so we also have animations and all

that.

444

00:43:43,482 --> 00:43:46,373

Yeah, it's a product I'm proud of.

445

00:43:46,373 --> 00:43:50,413

Yeah, it's nice.

446

00:43:50,413 --> 00:43:52,255

Yeah, definitely.

447

00:43:52,255 --> 00:43:56,857

There is so much that we've done, in this foreign territory.

448

00:43:56,857 --> 00:43:58,597

Well, I learned so much because...

449

00:43:58,597 --> 00:43:59,988

Me too.

450

00:44:00,009 --> 00:44:07,452

Yeah, as you were saying, it sounds like regression is just something from the past.

451

00:44:07,452 --> 00:44:09,932

But it's actually used all the time.

452

00:44:11,455 --> 00:44:28,219

You know, even the big LLMs now, in the end, it's a lot of dot products, and dot products are matrices multiplied with vectors, and, you know, a linear regression is actually not

453

00:44:28,219 --> 00:44:29,120

that far from that.

454

00:44:29,120 --> 00:44:31,241

It's actually exactly that.
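[The dot-product point can be made concrete in a few lines. The numbers below are toy data, just to show that a linear regression's predictor is literally a matrix-vector product.]

```python
import numpy as np

rng = np.random.default_rng(1)

# Design matrix: a column of ones (intercept) plus two predictors
X = np.column_stack([np.ones(100), rng.normal(size=100), rng.normal(size=100)])
beta = np.array([1.5, 2.0, -0.7])   # intercept and slopes

# The linear predictor of a regression is a dot product of the design matrix and the coefficients
mu = X @ beta
y = mu + rng.normal(scale=0.5, size=100)

# Ordinary least squares recovers beta from that same dot-product structure
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat.round(2))
```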

455

00:44:31,261 --> 00:44:40,568

So if you learn and understand really the nitty-gritty of regressions, complex regressions,

456

00:44:41,751 --> 00:44:47,745

you already know a lot of the things you're going to need.

457

00:44:47,865 --> 00:44:52,909

You're going to need them when doing Bayesian modeling in the trenches.

458

00:44:53,409 --> 00:44:54,740

That's, that's for sure.

459

00:44:54,740 --> 00:45:02,215

And that's also why I learned so much in this course, because I had to really dig into the

regression models.

460

00:45:02,836 --> 00:45:09,741

And we show you how to do that, from simple regression to binomial regression.

461

00:45:09,741 --> 00:45:25,035

Poisson regression, stuff you guys obviously at least have heard about, but then we teach you more niche and advanced concepts like zero-inflated regressions, overdispersed

462

00:45:25,035 --> 00:45:36,909

regression, which is one of the chapters you worked on, Tommy, and you folks are gonna

learn a lot on that, like not only how to do the models, but then what to do with the

463

00:45:36,909 --> 00:45:37,811

models after.

464

00:45:37,811 --> 00:45:44,244

how to diagnose them, how to become confident about the model's predictions.

465

00:45:44,265 --> 00:45:57,552

And also we teach you about a personal favorite of mine, which is the categorical and

multinomial regressions, which I use a lot for electoral forecasting.

466

00:45:57,932 --> 00:46:06,536

But also you're going to use them a lot, for instance, for any more than two categories,

you're going to use a multinomial or a categorical.

467

00:46:07,039 --> 00:46:13,174

And that's just extremely important to know about them because they are not trivial.

468

00:46:13,174 --> 00:46:18,758

There are a lot of subtleties and difficulties, and we show you how to handle that.

469

00:46:19,059 --> 00:46:22,982

I think, personally, I learned so much.

470

00:46:22,982 --> 00:46:36,599

Something I really loved is what you did in the overdispersion lesson, you know, where you were diagnosing the overdispersion and coming up with a bunch

471

00:46:36,599 --> 00:46:41,112

of custom plots to show that the model is underdispersed.

472

00:46:41,112 --> 00:46:43,153

Yeah, that's a term.

473

00:46:43,153 --> 00:46:44,713

Compared to the data.

474

00:46:44,894 --> 00:46:55,840

And also then coming up with a test statistic, a custom test statistic, to actually see whether the model is underdispersed or not.

475

00:46:55,840 --> 00:47:04,425

And I think that's really powerful because that shows you also that in the Bayesian framework... I often get that question from beginners:

476

00:47:04,425 --> 00:47:05,441

can I compute

477

00:47:05,441 --> 00:47:09,183

test statistics, because that's a big one in the frequentist framework.

478

00:47:09,183 --> 00:47:10,443

I'm like, yeah, sure.

479

00:47:10,443 --> 00:47:14,695

But you can also invent your own test statistics for your own purpose here.

480

00:47:14,695 --> 00:47:18,296

You don't have to use a pre-baked test statistic.

481

00:47:18,296 --> 00:47:19,347

You have posterior samples.

482

00:47:19,347 --> 00:47:24,088

You can do whatever you want with them.
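[This is not the course's code, just a sketch of the general idea: define your own test statistic (here, the variance-to-mean ratio of counts) and compare it between posterior predictive draws and the observed data. The data below are simulated placeholders; in practice the replicated draws would come from your fitted model.]

```python
import numpy as np

rng = np.random.default_rng(7)

# Observed counts and posterior predictive draws (shape: n_draws x n_obs), here simulated
y_obs = rng.poisson(lam=4, size=200)
y_rep = rng.poisson(lam=4, size=(2000, 200))

# Custom test statistic: variance-to-mean ratio (about 1 for a Poisson, above 1 if overdispersed)
def dispersion(y, axis=-1):
    return y.var(axis=axis) / y.mean(axis=axis)

t_obs = dispersion(y_obs)
t_rep = dispersion(y_rep)

# Posterior predictive p-value: how often the replicated statistic exceeds the observed one
p_value = (t_rep >= t_obs).mean()
print(f"observed dispersion = {t_obs:.2f}, Bayesian p-value = {p_value:.2f}")
```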

483

00:47:24,549 --> 00:47:31,011

I thought that was like, that's definitely one of my favorite parts of the course.

484

00:47:31,191 --> 00:47:35,125

And something I realized we forgot to mention, and I really like,

485

00:47:35,125 --> 00:47:48,214

about the course and I really like having that in the course is all the different parts

where we talk about parameter identifiability and overparameterization and it's like we

486

00:47:48,214 --> 00:47:55,999

don't tell you, take this outcome, take these three predictors and put them into the

machine and you're good to go.

487

00:47:57,320 --> 00:48:04,605

I think that's probably, that will be a difficult part the first time you encounter it

488

00:48:04,735 --> 00:48:11,247

in the course, but we cover it multiple times in multiple lessons.

489

00:48:11,708 --> 00:48:21,932

And the reason is it's a very important topic that's covered in many places, but I think

with not enough emphasis.

490

00:48:22,472 --> 00:48:34,117

So we did our best to include that topic in many lessons to show it from different angles,

show how it can happen under

491

00:48:34,463 --> 00:48:40,377

different circumstances, and that's something I'm really proud of.

492

00:48:41,519 --> 00:48:48,905

How much time and effort we invested in non-identifiability, parameter redundancy, and all that.

493

00:48:48,905 --> 00:48:56,831

And the different approaches to deal with that, that's something I'm proud of.

494

00:48:56,932 --> 00:48:58,693

I'm very happy we did that.
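[To make the idea of parameter redundancy concrete, here is a tiny made-up example of the kind of situation being described: two intercept-like parameters that only ever enter the model through their sum, so the data cannot tell them apart.]

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)

# Data generated with a single true intercept of 2.0
y = 2.0 + 1.5 * x + rng.normal(scale=0.3, size=200)

# An overparameterized model: mu = a + b + beta * x.
# Only the sum (a + b) is identified; any pair with the same sum fits equally well,
# e.g. (a=0, b=2) and (a=100, b=-98) give identical residuals and likelihoods.
for a, b in [(0.0, 2.0), (100.0, -98.0)]:
    resid = y - (a + b + 1.5 * x)
    print(f"a={a:6.1f}, b={b:6.1f} -> residual SD = {resid.std():.3f}")
```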

495

00:49:01,536 --> 00:49:02,186

Yeah, definitely.

496

00:49:02,186 --> 00:49:03,657

That's a very good point.

497

00:49:04,395 --> 00:49:16,093

I think I finally understand overparameterization by working on this course because we see

it from, I think from lesson two or three, up until the last lesson, which is lesson nine.

498

00:49:16,093 --> 00:49:17,574

Yes.

499

00:49:17,574 --> 00:49:19,576

And we see it repeatedly.

500

00:49:19,576 --> 00:49:26,380

And I think that's really good because it's a hard concept that's related to non-identifiability.

501

00:49:26,380 --> 00:49:33,265

That happens a lot in models, not only Bayesian models, all the, like any statistical

model, but it's

502

00:49:33,931 --> 00:49:37,422

a mathematical thing.

503

00:49:38,063 --> 00:49:41,604

And then it appears all the time in models.

504

00:49:41,604 --> 00:49:44,916

And that's related to non-identifiability, but it's hard to understand.

505

00:49:44,916 --> 00:49:50,788

So you have to repeat it and really, really understand what that means.

506

00:49:50,788 --> 00:49:55,850

And only then can you develop an intuition of what that really is and when it happens.

507

00:49:56,450 --> 00:50:03,153

So yeah, definitely that's also something I personally learned a lot from and enjoyed a

lot

508

00:50:03,489 --> 00:50:06,131

in building this course.

509

00:50:06,131 --> 00:50:08,733

Yeah, me too.

510

00:50:08,733 --> 00:50:20,311

What would you say is your favorite part of all the curriculum right now and also what is

the part that was much more complicated than you anticipated?

511

00:50:20,311 --> 00:50:24,404

Good question.

512

00:50:29,271 --> 00:50:37,443

I don't know if this is a favorite part, but something I really like about the course is

how many visualizations we created.

513

00:50:37,463 --> 00:50:47,166

Like in every model, we always created a visualization to explore the posterior, to plot

predictions, to do things like that.

514

00:50:47,866 --> 00:50:58,149

I really like when you create a model and you don't just show two numbers, you make a

beautiful thing to communicate what you found.

515

00:50:59,927 --> 00:51:01,938

That's something I really like.

516

00:51:03,620 --> 00:51:17,681

definitely, my favorite parts are the more advanced parts, like starting perhaps in lesson

five, lesson six, when we talk about categorical regression, multinomial regression, and

517

00:51:17,681 --> 00:51:19,363

then everything that happens after that.

518

00:51:19,363 --> 00:51:25,737

Because I think that every lesson has many things to learn.

519

00:51:26,819 --> 00:51:29,861

So I couldn't say, okay, this

520

00:51:30,263 --> 00:51:50,171

is the part I enjoy the most, because I enjoy all of them. But definitely the second half. And

something that was difficult, actually: while working on the lesson about overdispersion, I

521

00:51:50,171 --> 00:51:59,785

looked through a lot of books, papers and all that and it was not easy at all to

522

00:52:00,257 --> 00:52:13,286

find many references, examples, datasets, very well-worked examples from end to end.

523

00:52:15,308 --> 00:52:25,034

Honestly, I thought I would find a lot more, many more resources, and it was not that

easy.

524

00:52:25,034 --> 00:52:26,835

I read papers

525

00:52:28,941 --> 00:52:31,261

from 50 years ago.

526

00:52:31,541 --> 00:52:36,001

Those scanned papers, like, written on typewriters.

527

00:52:36,821 --> 00:52:41,201

Yeah, that was harder than what I anticipated.

528

00:52:41,261 --> 00:52:56,601

Crafting that lesson required a lot of reading, not only for the complexity, but also to

find resources that helped me build the lesson.

529

00:52:56,601 --> 00:52:58,281

Yeah, definitely that

530

00:52:58,781 --> 00:53:00,662

was challenging and unanticipated.

531

00:53:00,662 --> 00:53:02,472

Yeah, that lesson was hard, for sure.

532

00:53:02,472 --> 00:53:06,503

That was a difficult one.

533

00:53:06,603 --> 00:53:25,669

Yeah, I mean, for me, I think my favorite part was really, as I was saying, not learning,

but really getting to another level of understanding of unidentifiability and of

534

00:53:25,669 --> 00:53:27,029

overparameterization.

535

00:53:27,753 --> 00:53:34,418

And also, the next level in my understanding of the zero-sum normal distribution.

536

00:53:35,059 --> 00:53:37,881

Because I had to use it a lot in the whole lesson.

537

00:53:38,081 --> 00:53:44,336

And so, I mean, in the lessons, in all the lessons I'm teaching in this course, so three

of them, I'm using zero-sum normal.

538

00:53:44,336 --> 00:53:46,949

So I had a really deep, deep...

539

00:53:46,949 --> 00:53:57,136

And actually, that's something that, yeah, the students have said from the beta version

that

540

00:53:58,369 --> 00:54:07,233

Yeah, it's very interesting to see how you solve one of the unidentifiability issues that can

happen in models.

541

00:54:07,233 --> 00:54:22,019

So like, for instance, with multinomial models, one of the probabilities, like the last

category's probability is entirely determined by the n minus one previous categories.

542

00:54:22,019 --> 00:54:26,000

So that's basically what an overparameterization is.

543

00:54:26,080 --> 00:54:28,139

If you put a parameter on

544

00:54:28,139 --> 00:54:40,832

the n categories, then your model is overparameterized because the last category is

entirely determined once you know about the previous n minus one.

545

00:54:41,033 --> 00:54:46,114

And so there are at least two ways to solve that as we show in the course.

546

00:54:46,174 --> 00:54:53,476

One of the classic ones, and it's the one that's automatically implemented in Bambi, is

reference encoding.

547

00:54:53,476 --> 00:54:57,097

So you take one of the categories and you consider that

548

00:54:57,279 --> 00:55:03,192

is the reference one and you fix it to an arbitrary number.

549

00:55:03,192 --> 00:55:05,913

So fix that parameter to an arbitrary number.

550

00:55:05,913 --> 00:55:07,413

Usually it's zero.

551

00:55:08,254 --> 00:55:13,756

And then all the other categories, these parameters are in reference to that category.

552

00:55:13,816 --> 00:55:25,941

So you could do that, but you can also do, and that's what we show you also in the course,

you can also say, well, instead of fixing one category to zero, I'm going to fix the

553

00:55:26,453 --> 00:55:28,314

sum of the categories to zero.

554

00:55:28,594 --> 00:55:40,720

And that way you can still have n parameters, one for each category, which is really cool

because that way you don't have to think about one category as a reference.

555

00:55:41,361 --> 00:55:46,944

And you just use a zero-sum normal distribution instead of a normal distribution.

556

00:55:46,944 --> 00:55:51,807

And that distribution is going to make sure that the categories sum to zero.

557

00:55:52,587 --> 00:55:55,969

So whether you prefer one or the other will depend on the context.

558

00:55:56,129 --> 00:56:07,717

But usually when you don't have a natural placebo, you will probably prefer the

zero-sum normal parameterization, because then there is no obvious reference.

559

00:56:07,717 --> 00:56:14,501

Whereas if a placebo is an obvious reference, you probably want all the parameters in

reference to that category.

560

00:56:14,702 --> 00:56:20,406

But the zero-sum normal is going to be in reference to the average of all the categories.

561

00:56:20,406 --> 00:56:24,768

And you can actually model an average for all the categories

562

00:56:25,505 --> 00:56:31,947

with this parameterization, and then all the categories will be an offset of that baseline.
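
To make the zero-sum parameterization concrete, here is a minimal sketch (not the course's code) of a categorical intercept model in PyMC; the data, sizes, and variable names are made up for illustration.

```python
import numpy as np
import pymc as pm

# Toy data: a categorical outcome with 4 levels (made up for illustration).
rng = np.random.default_rng(42)
n_categories = 4
y = rng.integers(0, n_categories, size=200)

with pm.Model() as zsn_model:
    # One intercept per category, constrained so the intercepts sum to zero.
    # The constraint removes the redundant degree of freedom, so the model
    # stays identified without singling out one category as the reference.
    alpha = pm.ZeroSumNormal("alpha", sigma=1.0, shape=n_categories)
    p = pm.math.softmax(alpha, axis=-1)
    pm.Categorical("y", p=p, observed=y)
    idata = pm.sample()
```

Here every category gets its own parameter, read as an offset from the average of the categories, which is the behavior described above.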

563

00:56:32,447 --> 00:56:42,810

So that was definitely something super interesting that helped me level up my

understanding of that distribution in the course.

564

00:56:43,070 --> 00:56:47,521

And definitely a lot of beta testers appreciated it.

565

00:56:47,521 --> 00:56:52,292

I guess you want to say something also, but that's only because you know the zero-sum

normal quite well.

566

00:56:52,292 --> 00:56:52,942

Yeah, yeah.

567

00:56:52,942 --> 00:56:54,893

But something like

568

00:56:56,023 --> 00:56:58,780

Something nice I wanna say about the zero-sum normal.

569

00:57:02,849 --> 00:57:15,262

In PyMC, the ZeroSumNormal is implemented as a distribution, which, I think, it would be

better if we could say, okay, this is a normal distribution plus a transformation or a

570

00:57:15,262 --> 00:57:16,333

restriction.

571

00:57:17,993 --> 00:57:31,237

But having something called ZeroSumNormal and being able to use it the same as any

other PyMC distribution is very convenient because the user doesn't have to deal with all

572

00:57:31,237 --> 00:57:32,109

the details.

573

00:57:32,109 --> 00:57:33,770

to get that constraint.

574

00:57:34,491 --> 00:57:46,280

While if, in PyMC, you wanna have another encoding, like you wanna have a reference level,

you have to do it in a very manual way.

575

00:57:47,702 --> 00:57:53,286

You have to create a vector of normals with shape n minus one.

576

00:57:53,286 --> 00:57:57,750

Then you have to concatenate a zero to that vector.

577

00:57:57,750 --> 00:58:02,183

And then you get a new vector, and that's the vector you use in your model.

578

00:58:03,362 --> 00:58:20,350

And you end up having like a constant in your trace, and then ArviZ complains about not

being able to compute R-hat, for example, because the values are all zeros or all constant.
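
That manual reference-level encoding might look roughly like the following sketch (again with hypothetical names and toy data), where the first category's parameter is pinned to zero and the others are read as offsets from it.

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt

rng = np.random.default_rng(0)
n_categories = 4
y = rng.integers(0, n_categories, size=200)

with pm.Model() as reference_model:
    # n - 1 free parameters...
    alpha_free = pm.Normal("alpha_free", mu=0.0, sigma=1.0, shape=n_categories - 1)
    # ...with a hard-coded zero prepended for the reference category.
    alpha = pt.concatenate([pt.zeros(1), alpha_free])
    # Tracking the full vector puts a constant entry in the trace, which is
    # what trips up R-hat and similar diagnostics downstream.
    pm.Deterministic("alpha", alpha)
    p = pm.math.softmax(alpha, axis=-1)
    pm.Categorical("y", p=p, observed=y)
```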

579

00:58:22,132 --> 00:58:28,195

And the zero-sum normal is also like more appealing for the general users.

580

00:58:28,195 --> 00:58:31,436

They just replace Normal with ZeroSumNormal.

581

00:58:33,057 --> 00:58:35,238

and you're good to go.

582

00:58:35,678 --> 00:58:38,840

That doesn't mean we shouldn't think about what we're doing.

583

00:58:38,840 --> 00:58:49,786

I'm just talking about, from like a user experience perspective, it's much easier to use a

ZeroSumNormal and also more intuitive in most of the cases.

584

00:58:51,227 --> 00:59:02,903

But yeah, I think the summary and how this relates to the course is think about parameter

restrictions that you add to the model.

585

00:59:02,903 --> 00:59:13,356

Think about how that changes the meaning of the parameters, and then be responsible with

what you do.

586

00:59:13,356 --> 00:59:19,998

But know that there's not a single recipe for solving that kind of problem.

587

00:59:20,838 --> 00:59:21,918

Yeah, yeah.

588

00:59:21,918 --> 00:59:31,201

Yeah, and that's also why we have the whole community at Intuitive Bayes and we have the

Discourse where people can ask questions, because unfortunately there is no...

589

00:59:31,327 --> 00:59:32,638

one size fits all.

590

00:59:32,638 --> 00:59:40,401

I mean, I say unfortunately, that's actually pretty cool because otherwise, I guess what

we're doing would be pretty boring.

591

00:59:43,262 --> 00:59:47,583

Time is running by and I think we've covered that topic quite well.

592

00:59:47,583 --> 00:59:54,307

I could talk about regression for quite a long time, but I think that's a good overview.

593

00:59:54,307 --> 00:59:59,243

And of course, if people are interested in some of the topics we talked about here,

594

00:59:59,243 --> 01:00:08,097

Let me know and I can do a special episode about some parts of regression that you're

interested in or you're really wondering about.

595

01:00:08,097 --> 01:00:21,802

Or we can even do a modeling webinar showing you some things, some answers to the most

frequently asked questions you have about regression.

596

01:00:22,122 --> 01:00:24,523

for sure, let us know about that.

597

01:00:24,523 --> 01:00:29,025

And well, if we made you curious to take the course.

598

01:00:29,101 --> 01:00:29,781

That's awesome.

599

01:00:29,781 --> 01:00:34,422

I think this will be a lot of hours well invested.

600

01:00:34,623 --> 01:00:37,283

Yeah, because it's nine lessons.

601

01:00:37,283 --> 01:00:41,344

It's, I don't know how many hours of videos, but a lot.

602

01:00:41,685 --> 01:00:44,665

You have lifetime access to that.

603

01:00:44,665 --> 01:00:47,886

You have exercises, which are very important.

604

01:00:47,886 --> 01:00:58,301

Folks, I know I sound like a very old professor here, but actually I think the most

valuable part of the course is not only watching the videos, but also doing the exercises.

605

01:00:58,301 --> 01:01:08,277

and going through the solutions that you have all on the repo and asking questions on the

discourse, answering questions on the discourse, being part of that community.

606

01:01:08,277 --> 01:01:22,575

Basically, that's really how you're going to get the most out of it. Yeah, like, you can

not learn how to ride a horse by just watching people riding horses.

607

01:01:23,075 --> 01:01:25,536

It's the same with Bayesian modeling.

608

01:01:26,517 --> 01:01:32,489

If you just watch the videos, that will be entertaining for sure, but you're not gonna get

the most out of it.

609

01:01:32,489 --> 01:01:33,920

So, yeah.

610

01:01:33,920 --> 01:01:37,992

And if you do take the course, please say hi.

611

01:01:37,992 --> 01:01:44,444

We are gonna be very happy to have you there and definitely wanna hear from you.

612

01:01:44,905 --> 01:01:56,031

Tell me maybe, yeah, something I wanted to ask you before letting you go is, I know you've

done some work lately about sparse matrices.

613

01:01:56,031 --> 01:02:04,437

If I remember correctly, in PyTensor, is that something you think would be useful here to

share a bit for listeners?

614

01:02:04,437 --> 01:02:13,653

Yeah, yeah, I can. It's a topic I really like and I wish I knew more about, and I'm always

like trying to learn.

615

01:02:15,094 --> 01:02:23,100

Like there's some depth at which I know nothing about how that works.

616

01:02:23,100 --> 01:02:24,460

But basically,

617

01:02:25,041 --> 01:02:29,823

You already mentioned this, many things can be expressed as dot products.

618

01:02:30,383 --> 01:02:39,627

And a subset of those many things can be expressed as a dot product between a matrix and a

vector.

619

01:02:39,848 --> 01:02:42,768

That happens all the time in linear models.

620

01:02:43,089 --> 01:02:47,230

That's basically the gist of a linear model.

621

01:02:47,491 --> 01:02:52,737

And in a subset of those cases, one

622

01:02:52,737 --> 01:02:56,859

of them, the matrix of that dot product, is very sparse.

623

01:02:58,660 --> 01:02:59,700

And if it's very sparse...

624

01:02:59,700 --> 01:03:00,631

So define a sparse...

625

01:03:00,631 --> 01:03:04,222

Yeah, define a sparse matrix, for example.

626

01:03:05,083 --> 01:03:13,166

You have many entries in a matrix, but most of them, the great majority of them, are zero.

627

01:03:14,107 --> 01:03:19,029

So it means in the multiplication they are not going to contribute anything to the final result.

628

01:03:19,837 --> 01:03:34,968

If you do a dot product between a sparse matrix and a dense vector, dense is the opposite

of a sparse, meaning that you can have some zeros, but you don't have so many zeros to the

629

01:03:34,968 --> 01:03:39,911

point where non-zeros are the rare value.

630

01:03:40,492 --> 01:03:48,657

Anyway, if you have a big sparse matrix and a dense vector and you multiply them, you do a

dot product.

631

01:03:49,293 --> 01:04:00,118

you're going to spend a lot of time computing things that are zero and will always be

zero and contribute nothing to the end result.

632

01:04:02,199 --> 01:04:16,565

Of course there are, like, for a long time there have been structures to store these

special matrices in computers in such a way that you save space because

633

01:04:16,991 --> 01:04:26,246

If you have a huge matrix with a lot of zeros stored in a dense way, that takes memory.

634

01:04:26,766 --> 01:04:31,609

If you don't tell the computer those values are all the same, it doesn't know about that.

635

01:04:31,609 --> 01:04:35,751

So it's going to take a lot of memory to store that matrix.

636

01:04:35,751 --> 01:04:46,477

But with a sparse matrix, first you can save a lot of space in the storage of the matrix.

637

01:04:46,593 --> 01:04:56,777

And then you can exploit the sparsity to do less computations.

638

01:04:56,917 --> 01:05:00,258

And at the end of the day, you have computations that run faster.
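
As a rough illustration of the savings (using SciPy here rather than PyTensor, with made-up sizes), the sparse representation stores only the non-zero entries, and the matrix-vector product skips all the multiplications by zero:

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)

# A design matrix where roughly 99% of the entries are zero
# (think one-hot encoded group indicators).
n_rows, n_cols = 5_000, 2_000
X_dense = rng.random((n_rows, n_cols)) * (rng.random((n_rows, n_cols)) < 0.01)
X_sparse = sparse.csr_matrix(X_dense)  # only the non-zero entries are stored
beta = rng.normal(size=n_cols)

# Both compute the same linear predictor X @ beta ...
mu_dense = X_dense @ beta
mu_sparse = X_sparse @ beta
assert np.allclose(mu_dense, mu_sparse)

# ... but the sparse product only touches the stored entries.
print(f"fraction of entries stored: {X_sparse.nnz / (n_rows * n_cols):.3f}")
```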

639

01:05:00,879 --> 01:05:14,904

And if you are doing MCMC, which means that you are evaluating the log P and its

derivative many, many times, it means you're multiplying.

640

01:05:15,025 --> 01:05:16,305

If you're doing

641

01:05:16,341 --> 01:05:20,924

matrix and vector multiplication a lot of times.

642

01:05:20,924 --> 01:05:31,999

So gaining time, making that computation faster is something that we want to have.

643

01:05:31,999 --> 01:05:40,554

yeah, PyTensor has some support for sparse matrices and sparse objects in general.

644

01:05:41,635 --> 01:05:46,047

But as far as I know, that support comes from

645

01:05:46,047 --> 01:05:47,858

old Theano days.

646

01:05:47,858 --> 01:05:54,599

There has been some maintenance, but not a lot of features have been added.

647

01:05:55,880 --> 01:06:07,543

And yeah, for some projects at Labs, I've been writing my custom things to do dot products

between sparse matrices and dense vectors.

648

01:06:08,323 --> 01:06:15,205

Unfortunately, I didn't have time yet to put that into PyTensor, but I want to do that

649

01:06:15,573 --> 01:06:22,956

If someone wants to collaborate on that endeavor, I'm more than happy.

650

01:06:25,037 --> 01:06:28,278

But yeah, I think it's something that we should do more.

651

01:06:28,278 --> 01:06:44,481

And the main motivation was that I wanted Bambi to do that by default, because Bambi is

doing the simple thing of multiplying big dense matrices.

652

01:06:44,481 --> 01:06:47,583

when some of those matrices could have been sparse.

653

01:06:49,345 --> 01:07:07,360

It's definitely not like new theory or new computational techniques, but it's taking

things that already exist and making them usable, first available and then usable for the

654

01:07:07,360 --> 01:07:08,661

wider community.

655

01:07:08,681 --> 01:07:13,355

And I don't know, I have fun doing those kinds of things.

656

01:07:13,841 --> 01:07:17,154

Yeah, I mean, I think this is extremely valuable.

657

01:07:17,154 --> 01:07:23,228

I hope you'll have time to include that in PyTensor.

658

01:07:23,228 --> 01:07:24,519

In a few weeks or months.

659

01:07:24,519 --> 01:07:29,803

I mean, if I had time, I'd definitely help you, man.

660

01:07:29,803 --> 01:07:33,115

Unfortunately, now with the new job and the other projects that I have

661

01:07:42,091 --> 01:07:46,645

to finish, like, I don't have a lot of time for that.

662

01:07:46,645 --> 01:07:53,950

yeah, but I mean, this is also definitely something that I want to learn more about

because it happens quite a lot.

663

01:07:54,211 --> 01:07:56,372

And this is extremely frustrating.

664

01:07:56,713 --> 01:08:07,903

Yeah, it's just like, it feels weird, because your brain, when it sees a zero, it

knows that this term is not going to be useful.

665

01:08:07,903 --> 01:08:11,005

So you can kind of get rid of it when you do the computation

666

01:08:11,095 --> 01:08:15,047

When you do any computation by hand, you get rid of the zeros very easily.

667

01:08:15,047 --> 01:08:18,111

But the computer doesn't know that.

668

01:08:18,111 --> 01:08:24,156

So you have to tell it because otherwise it spends a lot of time doing useless

computation.

669

01:08:24,156 --> 01:08:26,837

And then in the end it's like, yeah, that's a zero.

670

01:08:27,098 --> 01:08:29,800

But then you spent a lot of seconds doing that.

671

01:08:29,800 --> 01:08:31,962

And that's stupid.

672

01:08:31,962 --> 01:08:33,844

But you have to tell it, right?

673

01:08:33,844 --> 01:08:40,468

It's what I say about computers a lot: computers are very powerful, but often they are

very dumb.

674

01:08:40,661 --> 01:08:43,222

So you need to tell them exactly what you want.

675

01:08:43,222 --> 01:08:47,104

And that's basically what you're trying to do here.

676

01:08:47,104 --> 01:08:51,905

That's really interesting because that also happens very frequently, doesn't it?

677

01:08:51,926 --> 01:08:53,166

Yeah, yeah.

678

01:08:53,226 --> 01:09:02,670

For those who are curious about it and want to take a deeper dive, Daniel Simpson, he has

a very interesting blog.

679

01:09:02,670 --> 01:09:09,963

And in that blog, he has many posts about doing things with sparse matrices.

680

01:09:09,997 --> 01:09:22,716

Because I didn't mention this, but these matrices can have particular structures, and if

they have that particular structure, you can exploit some properties of the matrices and then do the

681

01:09:22,716 --> 01:09:24,386

computation even faster.

682

01:09:25,027 --> 01:09:32,232

like dot products, inverses, transposes, and things like that, determinants.

683

01:09:32,933 --> 01:09:39,237

If you have matrices with particular structures, you can exploit those structures to save time

684

01:09:39,841 --> 01:09:41,692

and perhaps also memory.

685

01:09:41,692 --> 01:09:56,908

And Daniel wrote a lot of posts doing things with sparse matrices using JAX, which, you know...

PyTensor has these multiple backends.

686

01:09:56,908 --> 01:10:02,010

It has a C backend, it has a Numba backend, and a JAX backend.

687

01:10:02,030 --> 01:10:09,193

And what has been frustrating to be honest is that the support for sparse matrices

688

01:10:09,857 --> 01:10:12,578

varies a lot in those backends.

689

01:10:14,339 --> 01:10:24,783

And that's one of the reasons that makes it harder to have something available that works

for most of the cases.

690

01:10:25,164 --> 01:10:32,726

So in my use case, I implemented what I needed for the particular model that I had.

691

01:10:34,567 --> 01:10:38,749

But if you want to have something public,

692

01:10:39,999 --> 01:10:47,053

available for the wider community, it should work in more than just one single case.

693

01:10:47,053 --> 01:11:01,821

But yeah, I think what's needed is a few people with some time to work on that and that

should be it because many things are already invented.

694

01:11:03,282 --> 01:11:05,643

I'm not saying the task is trivial, not at all.

695

01:11:05,643 --> 01:11:07,143

I'm saying it's...

696

01:11:09,805 --> 01:11:15,327

It's about investing time, programming, designing, testing, and all that.

697

01:11:16,148 --> 01:11:17,228

Yeah.

698

01:11:17,228 --> 01:11:19,399

Yeah, so you heard it, folks.

699

01:11:19,399 --> 01:11:29,893

Really, if you're interested in working on that, and you don't need to be an expert on

that, because we have people like Tomi on the PyMC repo who can mentor you.

700

01:11:30,074 --> 01:11:38,557

If you're interested in that and you want to dive a bit into open source, please contact

me and I'll put you in contact

701

01:11:38,763 --> 01:11:43,216

with the appropriate authorities, as we say.

702

01:11:43,317 --> 01:11:50,753

And yeah, so we should definitely put that blog post by Dan Simpson in the show notes,

Tomi, if you can do that.

703

01:11:50,753 --> 01:11:59,731

also, is there anything you can share already in the show notes from your custom

implementation?

704

01:11:59,731 --> 01:12:05,015

Yeah, I have all that in a repository that is public.

705

01:12:05,015 --> 01:12:07,297

Perhaps I can update it

706

01:12:07,681 --> 01:12:09,341

with the latest things.

707

01:12:10,742 --> 01:12:15,384

But I do have a few things to share.

708

01:12:15,864 --> 01:12:22,547

Both implementations and experiments of myself testing those implementations.

709

01:12:22,827 --> 01:12:23,888

Nice.

710

01:12:23,888 --> 01:12:25,548

Which implementations are those?

711

01:12:25,548 --> 01:12:28,249

In which cases could people use them?

712

01:12:28,630 --> 01:12:31,370

Just Matrix.

713

01:12:31,571 --> 01:12:34,512

If you write it, it's SpMV.

714

01:12:34,512 --> 01:12:36,252

It's a sparse matrix.

715

01:12:38,441 --> 01:12:39,642

SpMV, I think.

716

01:12:39,642 --> 01:12:44,184

But basically sparse matrix dense vector multiplication.

717

01:12:44,384 --> 01:12:46,646

That's what I care about.

718

01:12:46,646 --> 01:12:49,887

But that's in PyTensor.

719

01:12:51,128 --> 01:12:55,510

PyTensor, C, Numba, JAX, many things.

720

01:12:55,871 --> 01:13:00,513

But yeah, it's PyTensor with different backends.

721

01:13:00,973 --> 01:13:08,107

Okay, so it would be like, for instance, you could use that function that's written in

PyTensor.

722

01:13:08,181 --> 01:13:09,301

in a PyMC model.

723

01:13:09,301 --> 01:13:12,712

Yeah, yeah, that's the goal and that's what I did in my use case.

724

01:13:12,712 --> 01:13:14,123

Yeah, yeah, yeah.

725

01:13:14,123 --> 01:13:18,904

It's like you have a sparse matrix multiplication somewhere in your PyMC model.

726

01:13:18,904 --> 01:13:22,975

Instead of just doing pm.math.dot, you would use that custom...

727

01:13:22,975 --> 01:13:24,225

Another function.

728

01:13:24,646 --> 01:13:27,246

You would use that custom PyTensor function.

729

01:13:27,246 --> 01:13:28,127

Correct.
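
Tomi's own implementation isn't shown here, but a rough sketch of the idea, using the sparse ops that PyTensor inherited from Theano (treat the exact function names as assumptions to double-check against the PyTensor docs), could look like this:

```python
import numpy as np
import pymc as pm
import pytensor.sparse as sp
from scipy import sparse

# Hypothetical sparse design matrix (e.g., group indicators) and outcome.
rng = np.random.default_rng(1)
X_scipy = sparse.random(1_000, 200, density=0.02, format="csr", random_state=1)
y_obs = rng.normal(size=1_000)

# Wrap the SciPy matrix as a constant sparse variable in the PyTensor graph
# (assumed API: pytensor.sparse.as_sparse_variable).
X = sp.as_sparse_variable(X_scipy)

with pm.Model() as sparse_model:
    beta = pm.Normal("beta", mu=0.0, sigma=1.0, shape=200)
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    # Sparse matrix times dense vector, instead of pm.math.dot on a dense matrix.
    mu = sp.structured_dot(X, beta[:, None]).flatten()
    pm.Normal("y", mu=mu, sigma=sigma, observed=y_obs)
```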

730

01:13:28,127 --> 01:13:37,449

Yeah, but the problem I was telling you about is, let's say you want to use the great new

nutpie sampler, okay, then you need the Numba backend to be

731

01:13:37,645 --> 01:13:41,646

so you have that sparse thing implemented in Numba and so on.

732

01:13:42,067 --> 01:13:52,841

That definitely would be awesome, to have people help out on that.

733

01:13:52,841 --> 01:13:59,504

I would definitely love to do that; unfortunately, I cannot extend my days.

734

01:13:59,504 --> 01:14:02,835

That's really fascinating work.

735

01:14:02,835 --> 01:14:04,416

That's really cool.

736

01:14:04,416 --> 01:14:07,637

I'm hoping to have to do that at one point for work.

737

01:14:07,681 --> 01:14:09,241

So you are forced to do it?

738

01:14:09,241 --> 01:14:13,863

Yeah, either for the Marlins or for the Labs project.

739

01:14:14,263 --> 01:14:22,725

Because then I'm forced to dive into it and do it, and probably do a PR to finally push that

to the PyTensor universe.

740

01:14:22,725 --> 01:14:26,406

That's how a lot of my PRs end up being, you know.

741

01:14:26,726 --> 01:14:27,857

That'd be great, I'd say.

742

01:14:27,857 --> 01:14:29,807

I'd love that.

743

01:14:29,887 --> 01:14:34,808

I love that because I've definitely been beaten by that before.

744

01:14:35,989 --> 01:14:37,189

that's, yeah.

745

01:14:37,227 --> 01:14:44,681

I had also looked into implementing a sparse softmax in PyTensor.

746

01:14:44,701 --> 01:14:53,546

If I remember correctly, that didn't seem to be very hard, but I didn't have a lot of time

to work on that project, so I had to abandon it.

747

01:14:54,207 --> 01:14:57,348

But yeah, definitely that'd be super fun.

748

01:14:57,348 --> 01:15:05,753

Great, so, Tomi, it's already been a lot of time, maybe I just have one more question

before I go to the last two questions.

749

01:15:07,771 --> 01:15:19,844

Now, I know you learn a lot of stuff, and we kind of work similarly, so I think something I'd

like to ask you is: what are you thinking about these days?

750

01:15:19,844 --> 01:15:23,868

What do you want to learn in the coming week or coming month?

751

01:15:24,980 --> 01:15:26,632

that's an interesting question.

752

01:15:30,007 --> 01:15:35,020

I've been learning more about hierarchical models.

753

01:15:35,020 --> 01:15:39,303

So it seems like, but shouldn't you already know about that topic?

754

01:15:39,463 --> 01:15:43,846

Yeah, but turns out there are a lot of things to learn.

755

01:15:43,846 --> 01:15:59,873

And so I've been learning about Bayesian modeling and hierarchical models, like in multiple

ways, definitely gaining intuition through like computational exercises.

756

01:15:59,873 --> 01:16:01,233

That helped me a lot.

757

01:16:01,694 --> 01:16:17,098

But lately, I went to more formal sources to have a look at the math and have a look at

the properties to better understand assumptions, consequences of those assumptions, trying

758

01:16:17,098 --> 01:16:24,100

to understand when we can avoid computations.

759

01:16:24,100 --> 01:16:28,421

At some point, my understanding was, okay, we have HMC.

760

01:16:28,421 --> 01:16:30,145

This is the best thing in the world.

761

01:16:30,145 --> 01:16:36,528

We pass any model, in quotes, because it's not any model, but let's say any model, and it

just works.

762

01:16:36,528 --> 01:16:42,350

Okay, yes, you can have some problems but let's say it just works.

763

01:16:43,151 --> 01:16:52,534

But then I've been learning more about those cases where you can avoid using such a

sampler or you can...

764

01:16:52,715 --> 01:16:59,021

I know it sounds boring to write your own MCMC routine, but if you have a model

765

01:16:59,021 --> 01:17:15,281

that you know very well, and that's the model you want to use, and NUTS is going to take 30

hours because you have millions of parameters, probably it's worth it, like having a look at

766

01:17:15,281 --> 01:17:27,487

the theory and realizing if you can do something more. And I'm learning about that and I

really like it. It's challenging. I think that with

767

01:17:27,703 --> 01:17:36,739

the experience of having worked a lot with Bayesian models, it's much easier to digest all

that.

768

01:17:36,979 --> 01:17:41,282

So that's one of the things that I'm learning about.

769

01:17:42,243 --> 01:17:51,809

Another thing that I'm always learning, and there's a book that we have been sharing

lately with folks at Labs and on Twitter.

770

01:17:53,031 --> 01:17:56,971

The book is called Richly Parameterized

771

01:17:56,971 --> 01:17:58,682

Linear Models, or something like that.

772

01:17:58,682 --> 01:18:03,725

But something about models with a lot of parameters and how to work with those models.

773

01:18:04,186 --> 01:18:07,849

And the book is great.

774

01:18:07,849 --> 01:18:09,069

I enjoyed it.

775

01:18:09,510 --> 01:18:20,417

And the topic is the connection between many different models that seem to be different,

but how they are connected to each other.

776

01:18:20,617 --> 01:18:23,920

And I really enjoy that.

777

01:18:23,920 --> 01:18:26,451

Like, you have a spline model.

778

01:18:27,850 --> 01:18:37,717

You have a model with splines, and then you have a hierarchical model, but if you have these

particular priors and you look at the model's distribution, it matches that other thing.

779

01:18:37,717 --> 01:18:56,229

and seeing those connections between the different models and modeling approaches is

really nice because it may seem boring at some point but that's how you

780

01:18:57,089 --> 01:19:01,991

really grasp the depths of something.

781

01:19:03,052 --> 01:19:12,315

So yeah, those are two things I'm learning about these days and I enjoy learning about

those things.

782

01:19:12,456 --> 01:19:16,757

Yeah, I can tell you love learning about new things.

783

01:19:18,038 --> 01:19:23,881

I do too, I think that's why also we work so well together.

784

01:19:25,139 --> 01:19:28,012

And if you have a link to the book you just mentioned...

785

01:19:28,012 --> 01:19:30,913

Yeah, I will share the book so you can add it.

786

01:19:31,574 --> 01:19:35,437

I'm very bad at remembering exact names.

787

01:19:35,498 --> 01:19:41,502

Fortunately, I can just search my computer so I know one or two words and then I can get

what I want.

788

01:19:41,543 --> 01:19:42,803

That's cool.

789

01:19:43,624 --> 01:19:45,145

sounds about right.

790

01:19:46,727 --> 01:19:48,648

Well, Tomi, that's great.

791

01:19:48,648 --> 01:19:50,710

I think it's time to call it a show.

792

01:19:50,710 --> 01:19:52,031

We've covered a lot of ground.

793

01:19:52,031 --> 01:19:54,453

Of course, there are a ton of questions I could

794

01:19:54,453 --> 01:19:58,135

still ask you, but let's be respectful of your time.

795

01:19:58,656 --> 01:20:00,968

But before I let you go, of course,

796

01:20:00,968 --> 01:20:02,909

I'm gonna ask you the last two questions

797

01:20:02,909 --> 01:20:06,181

that I ask all the guests at the end of the show.

798

01:20:06,221 --> 01:20:07,662

you could...

799

01:20:08,263 --> 01:20:09,003

No, sorry.

800

01:20:09,003 --> 01:20:14,466

First one is if you had unlimited time and resources, which problem would you try to

solve?

801

01:20:16,208 --> 01:20:23,693

I don't know if this problem has like a particular name, but you know, I enjoyed...

802

01:20:23,709 --> 01:20:28,370

working with samples obtained with MCMC methods.

803

01:20:29,551 --> 01:20:35,312

And it's really nice learning about how they work and how to diagnose them and all that.

804

01:20:35,732 --> 01:20:50,717

But if we could have just a method that gives us real samples from any posterior

distribution that we work with, or we could have a very clever machine that knows the

805

01:20:50,717 --> 01:20:52,577

details about every model

806

01:20:52,577 --> 01:21:08,481

without us noticing, it uses a specific method to give us draws from the posterior,

meaning that you don't need to worry about divergences, convergence, and things like that,

807

01:21:08,481 --> 01:21:12,062

where you can just focus on the analysis of the outcome.

808

01:21:12,062 --> 01:21:13,443

I will work on that.

809

01:21:13,443 --> 01:21:21,575

Because, and it's something I've been thinking more these days, like, now I need to wait

for the compilation.

810

01:21:21,651 --> 01:21:26,163

and now I need to wait a few hours to get the draws.

811

01:21:26,263 --> 01:21:42,700

If I could have something that saved me from that, even though I enjoy learning about how

it works and how to improve it depending on the kind of problems I'm having, yeah, I would

812

01:21:42,700 --> 01:21:51,193

definitely like getting rid of MCMC and just doing MC.

813

01:21:52,097 --> 01:21:54,438

But I don't know if it's possible.

814

01:21:54,799 --> 01:21:59,121

But if I'm here to dream, I'm going to have like...

815

01:22:00,982 --> 01:22:02,533

Yeah, a very ambitious dream.

816

01:22:02,533 --> 01:22:03,363

sure.

817

01:22:03,443 --> 01:22:05,445

Yeah, Let's dream big.

818

01:22:05,445 --> 01:22:06,375

Yeah, I agree with that.

819

01:22:06,375 --> 01:22:07,526

Kind of having a...

820

01:22:07,526 --> 01:22:13,389

Yeah, what I often dream about is having kind of like a Jarvis, like in Iron Man.

821

01:22:13,389 --> 01:22:16,510

I mean, like, can you try that version of the model?

822

01:22:17,151 --> 01:22:18,491

Something like that.

823

01:22:19,252 --> 01:22:21,045

that'd be fantastic.

824

01:22:21,045 --> 01:22:22,525

Yeah.

825

01:22:22,926 --> 01:22:23,376

Nice.

826

01:22:23,376 --> 01:22:24,867

then second question.

827

01:22:24,867 --> 01:22:31,731

If you could have dinner with any great scientific mind, dead, alive, or fictional, who would

it be?

828

01:22:31,731 --> 01:22:37,534

And keep in mind that you cannot say myself because you already had dinner with me.

829

01:22:37,534 --> 01:22:39,095

then we have to finish the recording.

830

01:22:39,095 --> 01:22:44,518

Yeah, I knew you were going to answer myself, and I definitely appreciate that.

831

01:22:44,518 --> 01:22:47,799

But you already had dinner with me, so you have to choose one of us.

832

01:22:47,799 --> 01:22:49,600

Yeah,

833

01:22:51,647 --> 01:22:53,858

Again, let me explain the answer.

834

01:22:54,399 --> 01:23:02,145

I don't know why, but I'm a fan of movies and documentaries about World War II.

835

01:23:02,206 --> 01:23:16,887

And one movie I enjoyed a lot and like I was really into the movie with a lot of attention

and very interested in what was happening was the, I think in English it is called the

836

01:23:16,887 --> 01:23:20,500

Imitation Game, but in Spanish we call it...

837

01:23:21,857 --> 01:23:24,958

the Enigma code or something like that.

838

01:23:26,279 --> 01:23:29,600

And I really enjoyed that movie.

839

01:23:29,600 --> 01:23:46,197

And I was fascinated seeing the machine moving the things and making noise, trying to

crack the machines to understand the message and then using like, okay, now we have the

840

01:23:46,197 --> 01:23:46,718

information.

841

01:23:46,718 --> 01:23:48,428

What do we do with that information?

842

01:23:48,428 --> 01:23:49,729

So definitely...

843

01:23:51,713 --> 01:23:57,856

I'm talking about Alan Turing, and I would have dinner with him to talk about everything.

844

01:23:57,856 --> 01:24:14,292

How he was recruited, how they came up with ideas, how they used it, what was hard about

making choices, because it was both a technical problem but also a political, human

845

01:24:14,292 --> 01:24:15,322

problem.

846

01:24:16,383 --> 01:24:19,264

And then to talk about what happened after that.

847

01:24:19,424 --> 01:24:21,705

So yeah, I think

848

01:24:22,295 --> 01:24:32,412

The bad thing about that dinner would be that I would like it to last for many hours

because I would have many questions.

849

01:24:33,793 --> 01:24:41,859

But yeah, that would be one person I would like to have dinner with to interview and ask a

lot of things.

850

01:24:42,019 --> 01:24:43,570

Yeah, Great choice.

851

01:24:43,570 --> 01:24:45,001

Fantastic choice.

852

01:24:45,001 --> 01:24:48,223

Invite him at Christmas.

853

01:24:49,064 --> 01:24:50,325

Christmas dinner

854

01:24:50,509 --> 01:24:54,909

takes hours, so I think that's a very good opportunity.

855

01:24:55,149 --> 01:25:00,309

Whether in France or Argentina, they always last hours, so you know.

856

01:25:00,309 --> 01:25:01,749

That's good.

857

01:25:02,049 --> 01:25:02,809

Awesome.

858

01:25:02,809 --> 01:25:08,829

Well, thanks a lot. That was a blast to finally have you on the show.

859

01:25:09,409 --> 01:25:19,937

More than 100 episodes after you eavesdropped at Osvaldo's door at CONICET.

860

01:25:22,294 --> 01:25:29,187

In Spanish, I think you would say, a little bit Quechua, and yeah, I'm sure.

861

01:25:29,187 --> 01:25:30,117

yeah, yeah.

862

01:25:30,117 --> 01:25:46,023

Yeah, it's great to have you on the show. And as usual, we'll put a link to your

website, to your socials, and a lot of links for those who want to dig deeper.

863

01:25:46,124 --> 01:25:49,715

Thanks again, Tomi, for taking the time and being on this show.

864

01:25:50,298 --> 01:25:54,624

Thank you, it was a lot of fun to be honest.

865

01:25:54,624 --> 01:25:59,129

if Alex happens to invite you to the podcast, you have to say yes.

866

01:26:00,692 --> 01:26:01,813

Thank you, Alex.

867

01:26:07,117 --> 01:26:10,820

This has been another episode of Learning Bayesian Statistics.

868

01:26:10,820 --> 01:26:21,309

Be sure to rate, review, and follow the show on your favorite podcatcher, and visit

learnbayestats.com for more resources about today's topics, as well as access to more

869

01:26:21,309 --> 01:26:25,392

episodes to help you reach a true Bayesian state of mind.

870

01:26:25,392 --> 01:26:27,354

That's learnbayestats.com.

871

01:26:27,354 --> 01:26:30,216

Our theme music is Good Bayesian by Baba Brinkman.

872

01:26:30,216 --> 01:26:32,198

Featuring MC Lars and Mega Ran.

873

01:26:32,198 --> 01:26:35,360

Check out his awesome work at bababrinkman.com.

874

01:26:35,360 --> 01:26:36,555

I'm your host.

875

01:26:36,555 --> 01:26:37,606

Alex Andorra.

876

01:26:37,606 --> 01:26:41,709

You can follow me on Twitter at Alex underscore Andorra, like the country.

877

01:26:41,709 --> 01:26:49,014

You can support the show and unlock exclusive benefits by visiting Patreon.com slash

LearnBayesStats.

878

01:26:49,014 --> 01:26:51,396

Thank you so much for listening and for your support.

879

01:26:51,396 --> 01:26:53,698

You're truly a good Bayesian.

880

01:26:53,698 --> 01:26:57,200

Change your predictions after taking information in.

881

01:26:57,200 --> 01:27:03,873

And if you're thinking I'll be less than amazing, let's adjust those expectations.

882

01:27:03,873 --> 01:27:17,029

Let me show you how to be a good Bayesian, change calculations after taking fresh data in.

Those predictions that your brain is making, let's get them on a solid foundation.
