
From Childhood Linux to Red Hat

Key Points

  • Mo’s first encounter with Linux came when her brother bought Red Hat Linux for a college assignment, sparking her interest in customizing and modifying software.
  • She was drawn to the collaborative, open‑source community that let anyone contribute ideas and improvements, giving users a sense of empowerment rather than powerlessness.
  • Concerned that open‑source tools often lack user‑friendly interfaces, Mo pursued a dual degree in computer science and electronic media, followed by a master’s in human‑computer interaction, to make free software more accessible.
  • After completing her graduate program, she joined Red Hat to learn how the company operates and to help bridge the gap between powerful open‑source technology and intuitive user experiences.
  • Her ongoing motivation is the vibrant community culture at Red Hat, where shared knowledge and collective problem‑solving create a supportive environment for developers and users alike.

Full Transcript

**Source:** [https://www.youtube.com/watch?v=SkXgG6ksKTA](https://www.youtube.com/watch?v=SkXgG6ksKTA)

**Duration:** 00:34:11

## Sections

- [00:00:00](https://www.youtube.com/watch?v=SkXgG6ksKTA&t=0s) **From Queens to Red Hat** - Mo recounts her New York upbringing and how her brother’s purchase of Red Hat Linux sparked her passion for customizing code and joining the open‑source community.

## Full Transcript
**Interviewer:** Mo, thank you for joining me today.

**Mo:** Thank you so much for having me.

**Interviewer:** You have just about the most Irish name ever.

**Mo:** I do. Very proud.

**Interviewer:** You weren't born in Ireland?

**Mo:** No, my grandparents.

**Interviewer:** Oh, your grandparents, I see. Where did you grow up?

**Mo:** New York. Queens.

**Interviewer:** So tell me a little bit about how you got to Red Hat. What was your path?

**Mo:** When I was in high school, I was a chatty teenage girl on the phone, and we had one phone line. My older brother was studying computer science at the local state college, and he had to telnet in to compile his homework. One phone line, and I'm on it all the time. He got very frustrated, and he needed a compiler to do his homework, so he bought Red Hat Linux from CompUSA, brought it home, and that went on the family computer. So I learned Linux, and I started playing around with it. I really liked it because you could customize everything, like the entire user interface. You could actually modify the code of the programs you were using to do what you wanted, and for me that was really cool, because especially when you're a kid and people tell you "this is the way things are and you just have to deal with it," it's nice to be like, I'm going to make things the way I want and modify the code I'm playing with. Yeah, it was amazing, and it was just such a time. Before it was cool, I was doing it. And what I saw in that was the potential, number one, of a community of people working together. The internet existed; it was slow, it involved modems, but there were people you could talk to who would give you tips, and you'd share information, and this collaborative building of something together is really something special, right? I could file a complaint to whatever large software company made whatever software I was into, or I could go to an open source software community and be like, "hey guys, I think we should do this," and hear, "yeah, okay, I'll help, I'll pitch in." So you don't feel powerless; you feel like you can have an impact, and that was really exciting to me. However, open source software has a reputation for not having the best user interface, not the best user experience. So I ended up studying computer science and electronic media as a dual major, and then I did human-computer interaction as my master's, and my thought was: wouldn't it be nice if this free software, accessible to anybody, were easier to use, so more people could use it and take advantage of it? So, long story short, I ended up going to Red Hat saying, hey, I want to learn how you guys work, let me embed in your team, right out of my graduate program. And I'm like, I want to do this for a living, this is cool, so I thought this was the way to go, and I've been there ever since. They haven't been able to get rid of me.

**Interviewer:** To backtrack just a little bit: you were talking about the sense of community that surrounds this way of thinking about software. Talk a little bit more about what that community is like, the benefits of that community, why it appeals to you.

**Mo:** Sure. Part of the reason I actually ended up going down the graduate school track: suddenly you're a peer of your professors, and you're working side by side with them. At some point they retire, and you're the next generation. So it's sharing information, building on the work of others, in this cycle that extends past a human lifespan. And in the same way, the open source model is very similar, but you're actually building something. It's something in me; I'm just really drawn to it. I don't like talking about stuff, I like doing stuff. With open source software, the software doesn't cost anything, the code
is out there, and it generally uses open standards for the file formats. I can open up files today that I created in open source tools as a high school student, because they were using open formats, and that software still exists. I can still compile the code, and it's an active community project. These things can outlast any single company, in the same way that the academic community has been going for so many years and hopefully will continue. So it was not just the community around it, but the knowledge sharing, and bringing up the next generation as well; all of that really appealed to me. And also, at the center of it, the fact that we could democratize it by following this open source process and feel like we have some control. We're not at the mercy of some faceless corporation making changes that we have no impact on. That really appealed to me too.

**Interviewer:** For those of us who are not software proficient, take a step backwards and give me a kind of description of terms. What's the opposite of open source?

**Mo:** Proprietary. Proprietary is what we say.

**Interviewer:** So, specifically and practically, the difference would be what, between something that was open source and something that was proprietary?

**Mo:** Sure, there are a lot of differences. With open source software, you get certain rights when you're given the software: you get the right to share it, and the different licenses that are considered open source have different little things you have to be aware of. With proprietary code, it's 100% copyrighted by the company. A lot of times, when you sign your employment contract with a software company, you write code for them but you don't own it; you sign over your rights to the company. So if you leave the company, the code doesn't go with you. It stays in the ownership of that company. So then, when one company buys out another and kills a product, that code's gone. It's gone.

**Interviewer:** For a business: why would a business want to have open source code as opposed to proprietary?

**Mo:** Well, for the same reasons. Say you're a business. You've invested all this money into this software platform, you've upskilled your employees on it, and it's a core part of your business. And then a few years later, that company goes out of business, or something happens. Or even something less drastic: you really need this feature, but for the company that makes the software, it's not in their best interest; it's not worth the investment; they're not going to do it. How do you get that feature? You either have to completely migrate to another solution, and since this is something core to your business, that's going to be a big deal. But if it's open source, you could hire a team of experts, software engineers who are able to go do this for you: go into the upstream software community, implement the feature that you want, and it'll be rolled into the next version of that company's software. So even if that company didn't want to implement the feature, if they did it open source, they would inherit that feature from the upstream community, as we call it. So you have some control over the situation. If it's open source, you have an opportunity to actually effect change in the product. And you could then pick it up, or pay somebody else to pick it up, or another community could form and pick it up and keep it going. So there are more possibilities if it's open source. It's almost like an insurance policy.

**Interviewer:** So innovation, from the standpoint of the customer, is a lot easier when you're working in an open source environment?

**Mo:** Absolutely, yeah.

**Interviewer:** So
now at Red Hat, you're working with something called InstructLab. Tell us a little bit about what that is.

**Mo:** The thing that really excites me about getting to work on this project: AI has been the scary thing for me, because it's one of those things where, in order to pre-train a model, you have to have unobtainium GPUs, you have to have rich resources, it takes months, it takes expertise. There's a small handful of companies that can take a model from pre-training to something usable. And it kind of feels like those early days when I was delving into software. In the same way, I think if more people could contribute to AI models, then they wouldn't be influenced just by whichever company had the resources to build them. There's been a lot of emphasis on pre-training models: taking massive, terabyte-scale data sets, throwing them through masses of GPUs over months of time, spending hundreds of millions of dollars to build a base model. What InstructLab says is: okay, you have a base model; we're going to fine-tune it on the other end. That takes less compute. The way we've built InstructLab, you can play around with the technology and learn it on an off-the-shelf laptop that you can actually buy. So in this way we're enabling a much broader set of people to play with AI, to contribute to it, to modify it. And I'll tell you one story from Red Hat. Sui, our chief diversity officer, is very interested in inclusive language in open source software, and doesn't have any experience with AI. We have a community model, with an upstream project around it, for people to contribute knowledge and skills to the model. She's like, I want to teach the model how to use inclusive language: replace this word with this word, or this word with this word. I'm like, oh, that's so cool. So she paired up with Nicholas, who is a technical guy at Red Hat, and they built and submitted a skill to the model, so that you can just tell the model: can you please take this document and translate this language into more inclusive language? And it will do it. They submitted it to the community, and they were so proud. That's the kind of thing where maybe a company would be incentivized to do it, but if you have some tooling that's open source, something anybody can access, then those communities can actually get together and build that knowledge into AI models.

**Interviewer:** Just so I understand: what you have is the structure for an AI system. In other cases, individual companies own and train their own AI systems. It takes an enormous amount of resources; they hoover up all kinds of information, train it according to their own hidden set of rules, and then a customer might use that for some price. What you're saying is: in the same way that we democratized the writing of software before, let's democratize the training of an AI system, so anyone can contribute here and teach the model the things that they're interested in teaching it. I'm guessing, and you correct me: on the one hand, this model, at least in the beginning, is going to have a lot fewer resources available to it, but on the other hand, it's going to have a much more diverse set of inputs.

**Mo:** That's right. And the other thing is that IBM, as part of this project, has something called the Granite model family, and they've donated some Granite models. These are the ones that take the months and the terabytes of data and all the GPUs to train. So IBM has created one of those, and they have listed out and linked to the data sets that they used, and they talk about the relative proportions they
used when pre-training. So it's not just a black box; you know where the data came from, which is a pretty open position to take. That's what we recommend as the base. So you take this base Granite model that IBM has provided, and you use the InstructLab tooling that Red Hat works on to fine-tune the model into whatever you want.

**Interviewer:** I want to go back for a moment to the partnership between IBM and Red Hat here, with them providing the Granite model to your InstructLab. Is this the first time Red Hat and IBM have collaborated like this?

**Mo:** I think it's something that's been going on. Another product within the Red Hat family would be OpenShift AI, where they collaborate a lot with the IBM Research team; vLLM is one of the components of that product. So there's a nice exchange and collaboration between the two companies.

**Interviewer:** How large is the potential community of people who might contribute to InstructLab?

**Mo:** It could be thousands of people. I mean, we'll see; it's early days. This is early technology that was invented at IBM Research, and they partnered with us at Red Hat to build the software around it. There's still more to go. Right now we have a team in the community that's trying to build a web interface to make it easier for anybody to contribute, so we have a lot of that user-experience-for-the-contributor work still to do, which we're actively building. But my vision for it, going back to that academic model of learning from others and building upon it over time, is that it would be very good for us to go out and try to collaborate with academics of all fields. Like: hey, the model doesn't know about your field; would you like to put something into the model about your field, so it knows about it? Or even: you talked to the model and it got something wrong; let's correct it. Can we lean on your expertise to correct it and make sure it gets it right? And sort of use that community model as a way for everybody to collaborate. Because before InstructLab, my understanding is, if you wanted to take a model that's open source licensed, you could take it off the shelf from Hugging Face and fine-tune it yourself, but it's a bit of a dead end: you made your contributions, but there's no way for other people to collaborate with you. The way we've built this, based on how the technology works, everybody can contribute to it. This is something you can keep growing and growing over time.

**Interviewer:** What's the level of expertise necessary to be a contributor?

**Mo:** You don't need to be a data scientist, and you don't need to have exotic hardware. Honestly, even if you don't have laptop hardware that meets the spec for running InstructLab's laptop version, you can submit to the community and we'll actually build it for you; we have bots and such that do that. And we're hoping, over time, to make that more accessible: first by having a user interface, and then maybe later on a web service.

**Interviewer:** So give me an example of how a business might make use of InstructLab.

**Mo:** One of the things businesses are doing with AI right now is using hosted API services. They're quite expensive; businesses are finding value, but it's hard given the amount of money they're spending. And one of the things that's a little scary about it, too: you have very sensitive internal documents, and you have employees maybe not
understanding what they're actually doing. Because, if you're not technical enough, when you ask said public web-service AI model about something by copy-pasting internal company documents, that data is going across the internet into another company's hands, and that company probably shouldn't have access to it. So what both Red Hat and IBM are looking at in this space, like the InstructLab model: it's very modest, a 7-billion-parameter model, very small. It's very cheap to serve inference on a 7-billion-parameter model, and it's competing with trillion-parameter models that are hosted. You take this small model that's cheap to run inference on, you train it with your own company's proprietary data, inside the walls of your company, on your own hardware. You can do all sorts of actual data analysis on your most sensitive data and have confidence that it never left the premises.

**Interviewer:** In that use case, you're not actually training the model for everyone; you're just taking it and doing some private stuff with it.

**Mo:** Exactly.

**Interviewer:** Which doesn't leave the building. But that's separate from an interaction where you're doing something that contributes overall.

**Mo:** Right, and that's something maybe I should be more clear about. There are sort of two tracks here, and this is classic Red Hat: you have your upstream community track, and you have your business product track. The upstream community track is just enabling anybody to contribute to a model in a collaborative way and play with it. The downstream, product, business-oriented track is: now take that tech that we've honed and developed in the open community and apply it to your business knowledge and skills.

**Interviewer:** Let's do an imaginary case study.

**Mo:** Sure.

**Interviewer:** I'm a law firm. I'm in entertainment law. I have 100 clients who are big stars, and they all have incredibly complicated contracts. I feed a thousand of my company's contracts from the last 10 years into the model, and then, every time I have a new contract, I ask the model: am I missing something? Can you go back and look through all our own contracts and show me a contract that is missing key components or exposes us to some liability? In that case, the model would know my law firm's contracts really, really well. It's as if it's been working at my law firm. It's not distracted by other people's particular styles, or by a bunch of contracts from, say, the utility industry. It knows entertainment law contracts.

**Mo:** Exactly, yeah. And you can train it in your own image, your style of doing things. It's something your company can produce that is uniquely helpful to you. No third party could do that, because no third party understands how you do business, your history, and your documents. So it's a way of getting value out of the stuff you already have sitting in a file cabinet somewhere. It's very cool.

**Interviewer:** Give me a real-world case study where you think the business use case would be really powerful. What's a business that really could see an advantage to using InstructLab in this way?

**Mo:** The demo that I've given a couple of times at different events used an imaginary insurance company. Say you have this company; you have to recommend repairs for various types of claims, and you've been doing this for years. You know that if the windshield's broken, and you got in this type of accident, and it's this model of car, these are the kinds of things you want to look at. You could talk to any insurance agent in the field and be like,
"oh, you know, it's a Tesla, you might want to look at the battery," or something; they'll have some latent knowledge. So you can take that and train it into the model. Honestly, I think these kinds of new technologies are better when they're less visible. Say you have the claims agents in the field, and they have this tool, and they're entering the claim data while they're on the scene at the car, and it might say: oh look, I see this is a Ford Fiesta; these are things you want to look at for this type of accident. As you're entering the data, it could be going through the knowledge you loaded into the model and making these suggestions based on your company's background. And, hey, let's not make the same mistake twice; let's make new mistakes and learn from the stuff we already did. So that's one example, but there are so many different industries and ways this could help, and it could make those agents in the field more efficient.

**Interviewer:** Have you had anyone talk to you about using InstructLab in a way that surprised you?

**Mo:** I mean, some people have done funky things, but playing with the skills side is where I see a lot of creativity. The difference between knowledge and skills: knowledge is pretty understandable, right? Like historical insurance claims, or legal contracts. Skills are a little different, so whenever somebody submits a skill, it tends to be really creative, because it's not something that's super intuitive. Somebody submitted a skill, I don't know how well it worked, but it was making ASCII art: like, draw me a dog, and it'll do an ASCII art dog. I mean, that's stuff you can do programmatically. One that was actually very, very helpful was: take this table of data and convert it to this format. Oh, that's nice; that actually saves me time.

**Interviewer:** How far away are we from the day when I, Malcolm Gladwell, technology ignoramus, can go home and easily interact with InstructLab?

**Mo:** Maybe a few months.

**Interviewer:** A few months? I thought you were going to say a few years.

**Mo:** No, I think it'd be a few months.

**Interviewer:** Wow.

**Mo:** I hope. It's the power of open source innovation.

**Interviewer:** That's really interesting. I'm always taken by surprise; I'm still thinking in 20th-century terms about how long things take, and you're in the 22nd century, as far as I can tell.

**Mo:** Honestly, the InstructLab core invention was invented in a hotel room at an AI conference in December, with an amazing group of IBM Research guys. December of 2023.

**Interviewer:** Wait, back up. You have to tell the story.

**Mo:** This group of guys we've been working with were at this conference together, and it's a really funny story, because, you know, it's hard to get access to GPUs. Even at IBM it's hard to get access; everybody wants access. They did it over Christmas break, because nobody was using the cluster at the time, and they ran all of these experiments, and I'm like, whoa, this is really cool.

**Interviewer:** And their idea was: we can do a stripped-down AI model? And was the idea, even back then, to combine it with Granite? What was the core, original idea?

**Mo:** The original idea is sort of multi-part; there are multiple aspects to it. One of the aspects actually came later, but it starts at the beginning of the workflow: you're using a taxonomy to organize how you're fine-tuning the model. The old approach, they call it the blender approach: you just take a bunch of data of roughly the type you'd like, throw it in, and then see
what comes out. Don't like it? Okay, throw in more, try again, see what comes out. They instead used this taxonomy technique, where you actually build a taxonomy, with categories and subfolders, of "this is the knowledge and the skills we want to train into the model." That way you're systematic about what you're adding, and you can also identify gaps pretty easily: oh, I don't have a category for that, let me add it. So that's one of the parts of the invention here.

**Interviewer:** Point number one is: let's be intentional and deliberate in how we build and train this thing.

**Mo:** Yeah. And then the next component: it's actually quite expensive, and part of the expense of tuning models, and training models in general, is coming up with the data. What they wanted was a technique where you could have just a little bit of data and expand it, with something they call synthetic data generation. This is where you have a student-and-teacher workflow. You have your taxonomy; the taxonomy has the knowledge, like a business's knowledge documents, their insurance claims, and it has these quizzes that you write, and that's to teach the model. So I'm writing a quiz, just like you do in school: you read the chapter on the American Revolution and then you answer a ten-question quiz. You're giving the model a quiz. You need at least five questions and answers, and the answers need to be taken from the context of the document. Then you run it through a process called synthetic data generation: it looks at the document, so it'll look at the history chapter, it'll look at the questions and answers, and then it'll go back to that original document and come up with more questions and answers based on the format of the questions and answers you made. So you can take five questions and answers and amplify them into 100 questions and answers, 200 questions and answers. It's a second model that's making the questions and answers; that's synthetic data generation: using an AI model to make the questions. We use an open source model to do that. So that's the second part. And then the third part is a multiphase tuning technique to take the synthetic data and basically bake it into the model. So that's the approach, a general philosophy. Part of the approach is using Granite, because we know where the data came from. Another part is the fact that we're using small models that are cheap to run inference on; they're small enough that you can tune them on laptop hardware, so you don't need all the fancy, expensive GPU mania. It's a whole system; it's not any one component, but the approach they took was somewhat novel, and they were very excited when they saw the experimental results. There was a meeting between Red Hat and IBM, actually an IBM Research meeting that Red Hatters were invited to, and I think the Red Hatters involved saw the potential: whoa, we can make models open source, finally, rather than them just being these endless dead forks; we can make it so people can contribute back and collaborate. So that's when Red Hat became interested, and we sort of worked together. The research engineers from IBM Research, who came up with the technique, and my team, the software engineers who know how to take research code and productize it into actually runnable, supportable software, got together. We've been hanging out in the Boston office at Red Hat and
24:06building it out April 18th was when we 24:08went open source and we made all of our 24:11repositories with all of the code public 24:13and right now we're working towards a 24:14product release or supported product how 24:16long did it take you to be convinced of 24:19the value of this idea I mean so people 24:22get together in this hotel room they're 24:24running these experiments over Christmas 24:27are you aware of the experiments is they 24:28running them when did they no I didn't 24:30find out till February you find out 24:32February so they come to you in February 24:33and they say mo can you recreate that 24:38conversation 24:40well our CEO Matt Hicks and then Jeremy 24:43eer who's one of our distinguished 24:44engineers and Steve watt who's a VP were 24:47present I think at that meeting so they 24:49kind of brought it back to us and said 24:51listen we've invited these IBM research 24:53folks to come visit in in Boston you 24:57know work with them like see does this 24:59have any Merit could we build something 25:00from it and so they gave us some 25:02presentations we were very excited when 25:04they came to us it only had support for 25:07Mac laptops of course you know Red Hat 25:10we're Linux people so we're like all 25:11right we got to fix that so A bunch of 25:13the junior Engineers around the office 25:15kind of came in they're like okay we're 25:16going to build Linux support and they 25:18had it within like a couple days it was 25:20crazy cuz this was just meant to be hey 25:22guys you know what these are invited 25:24guests visiting our office see what 25:27happens we end up doing like weeks of 25:30hack fests and late night pizzas in the 25:32conference room and like playing around 25:34with it and learning and it was it was 25:37very fun very cool do anyone else doing 25:39anything like this is not my 25:40understanding that anybody else is doing 25:42it yet maybe others will try a lot of 25:45the focus has been on that pre-training 
phase.

Mhm.

But for us, again, that fine-tuning is more accessible, because you don't need all the exotic hardware. It doesn't take months. You can do it on a laptop; you can do a smoke-test version of it in less than an hour.

What does the word "smoke test" mean?

A smoke test means you're not doing a full fine-tuning of the model. It's a different tuning process, kind of lower quality, so it'll run on lower-grade hardware. You can see, hmm, did it move the model or not? But it's not going to give you the full picture; you need higher-end hardware to do the full thing. That's what the product will enable you to do once it's launched: you're going to need the GPUs, but when you have them, we'll help you make the best usage of them.

Yeah. And there's a little detail I want to go back to.

Sure.

In order to run the tests on this idea, way back when, they needed time on the GPUs. This would be the in-house IBM GPUs, and they were quiet at Christmas. How much time would you need on the GPUs to get proof of concept?

Well, what happens is, it's sort of a lot of trial and error, right? With a lot of this stuff, you come up with a hypothesis, you test it out: did it work or not? Okay. It's just like being in the lab with Bunsen burners and beakers and whatever. So it really depends; it can be hours, it can be days, depending on what they're trying to do. And sometimes they can cut the time down with the number of GPUs they have. So if I have a cluster of 8 GPUs, it might take a day, but if I can get 32, I can pipeline it and make it go faster and get it down to a few hours. It really depends. But when everybody's home for the holidays, it's a
lovely playground to get that stuff going fast.

Yeah. Let's jump forward one year. Tell me the status of this project, tell me who's using it, tell me how big it is. Give me your optimistic but plausible prediction about what InstructLab looks like a year from now.

A year from now, I would like to see a vibrant community around not just building knowledge and skills into a model, but coming up with better techniques and innovation around how we do it. I'd like to see the contributor experience be refined as we grow more and more contributors, so that a year from now, Malcolm Gladwell could come impart some of his wisdom into the model and it wouldn't be difficult; it wouldn't be a big lift. I would love to see the user interface tooling for doing that become more sophisticated. And I would love to see more people taking this and using it, maybe not sharing it with the community, but using it for some private usage. I'll give you an example. I'm in contact with a fellow who is doing AI research, and he's working with doctors. There are GPs in an area of Canada where there aren't enough GPs for the number of patients, so anything you can do to save doctors time to get to the next patient helps. One of the things he has been experimenting with is: can we use an open-source-licensed model that the doctor can run on their laptop, so they don't have to worry about all of the different privacy rules? It's private; it's on the laptop right there. Take his live transcription of his conversation with the patient and then convert it automatically to a SOAP format that can be entered in the database. Typically this takes a doctor 15 to 20 minutes of paperwork. Why not save him the paperwork? At least
have the model take a stab.

And does the model then hold on to that information, so he can interact with the model again when...

Well, that's the thing: not with InstructLab. Maybe that could be a future development. Once you're doing inference, it's not ingesting what you're saying back into itself; that only happens in the fine-tuning phase. So the idea would be that the doctor could maybe load in past patient data as knowledge, and then when he's trying to diagnose... you know what I'm saying. But the main idea is that somebody might have some private usage. I would love to see more usage of this tool to enable people who otherwise never would have had access to this type of technology. A GP doctor in a small country doesn't have GPUs; they're not going to hire some company to custom-build them a model. But maybe on the weekend, if he's a techie guy, he could play with an interesting model.

I mean, the more you talk, the more I'm realizing that the simplicity of this model is the killer app here. Once you know you can run it on a laptop, you have democratized use in a way that's inconceivable with some of these other, much more complex... And that's interesting, because one would have thought intuitively at the beginning that the winner is going to be the one with the biggest, most complex version, and you're saying, actually, no, there's a whole series of uses where being lean and focused enables a whole class of uses. Maybe another way of saying this is: who wouldn't be a potential InstructLab customer?

We don't know yet. It's so new; we haven't really had enough people experimenting and playing with it and figuring out all the things yet. But that's the thing that's so exciting about it:
like, I can't wait to see what people do.

Is this the most exciting thing you've worked on in your career?

I think so. I think so.

Well, we are reaching the end of our time, but before we finish, we're going to do a little speed round.

Sure.

All right. Complete the following sentence: in five years, AI will be...

Boring. It will be integrated. It'll just work, and there will be no "now with AI" thing. It'll be normal.

What's the number one thing that people misunderstand about AI?

It's just matrix algebra. It's just numbers. It's not sentient; it's not coming to take us over. It's just numbers.

So you're on, uh, Team Humanity.

Yeah, I'm on Team Humanity.

Good. What advice would you give yourself 10 years ago to better prepare for today?

Learn Python for real. It's a programming language that is extensively used in the community. I've always dabbled in it, but I wish I had taken it more seriously.

Yeah. Did you say you had a daughter?

I have three daughters.

You have three daughters? I have two. If you've got three, you're on your own. Are you making them study Python?

I am actually trying to do that. We're using a micro:bit microcontroller kit to build a custom video game controller. They prefer Scratch, because it's a visual programming language, but it has a Python interface too, and I'm pushing them towards Python.

Good. Chatbots and image generators are the biggest things in consumer AI right now. What do you think is the next big business application?

Private models. Small models, fine-tuned on your company's data, for you to use exclusively.

Are you using AI in your own personal life these days?

Honestly, I think a lot of us are using it and we don't even realize it. I mean, I'm an aficionado of foreign languages, and there are translation
programs that are built using machine learning underneath. One of the things I've been dabbling with lately is text summarization, because I tend to be very loquacious in my note-taking, and that's not so useful for other people who would just like a paragraph. So that's something I've been experimenting with myself, just to help my everyday work.

Yeah. We hear many definitions of "open" related to technology. What's your definition of open, and how does it help you innovate?

My definition of open is basically sharing and being vulnerable. Not just sharing in a "have a cookie" way, but in a "you know what, I don't actually know how this works, could you help me?" way. Being open to being wrong, being open to somebody helping you, and making that collaboration work. So it's not just about the artifact you're opening; it's your approach, how you do things, being open.

Yeah. Mo, I think that wraps us up. How can listeners follow your work and learn more about Granite and InstructLab?

Sure. You can visit our project web page at instructlab.ai, or you can visit our GitHub at github.com/instructlab. We have lots of instructions on how to get involved in InstructLab.

Wonderful. Thank you so much.

Thank you, Malcolm.
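To make the synthetic data generation step described in the interview concrete (a second "teacher" model amplifying a handful of seed Q&A pairs into hundreds), here is a minimal sketch. It is not InstructLab's implementation: the template list stands in for the open-source teacher model that would actually generate and rephrase the questions and answers, and every name in it is illustrative.

```python
def amplify_seed_examples(seeds, variants_per_seed=20):
    """Amplify a few seed Q&A pairs into many synthetic training pairs.

    In the workflow described above, a second (teacher) model writes the
    new questions and answers. Here, fixed question templates stand in
    for that model so the sketch stays runnable without any model.
    """
    templates = [
        "{q}",
        "Could you explain: {q}",
        "In your own words, {q}",
        "A colleague asks you: {q}",
        "Briefly, {q}",
    ]
    synthetic = []
    for seed in seeds:
        for i in range(variants_per_seed):
            template = templates[i % len(templates)]
            synthetic.append({
                "question": template.format(q=seed["question"]),
                # A real teacher model would rephrase the answer as well.
                "answer": seed["answer"],
            })
    return synthetic

# "Take five questions and answers, amplify them into 100":
seeds = [{"question": f"seed question {n}?", "answer": f"seed answer {n}"}
         for n in range(5)]
print(len(amplify_seed_examples(seeds)))  # 100
```

The point of the sketch is the shape of the pipeline, not the quality of the variants: a teacher model replaces the template table, and the resulting corpus feeds the multiphase tuning step.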
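The GPU arithmetic in the interview (a day on a cluster of 8 GPUs, pipelined down to a few hours on 32) assumes the job parallelizes well. A hedged sketch of that estimate, assuming near-linear scaling, which real training jobs only approximate:

```python
def estimated_hours(baseline_hours, baseline_gpus, gpus):
    """Wall-clock estimate if a job scaled linearly with GPU count.

    Real jobs fall short of linear scaling (communication overhead,
    pipeline bubbles), so treat this as an optimistic lower bound.
    """
    return baseline_hours * baseline_gpus / gpus

# A day-long run on 8 GPUs, re-run on a 32-GPU cluster:
print(estimated_hours(24, 8, 32))  # 6.0 -- "down to a few hours"
```

In practice researchers measure the speedup on a small run before committing to a cluster size, since efficiency drops as GPU count grows.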
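The doctor's-office example, converting a live visit transcription into a SOAP (Subjective, Objective, Assessment, Plan) note, would in practice prompt a locally hosted open-source model. As a runnable stand-in, the sketch below routes transcript lines into SOAP sections with a keyword table; the function name, keywords, and sample lines are all illustrative, not taken from any real clinical system.

```python
def transcript_to_soap(lines):
    """Sort visit-transcript lines into a SOAP note.

    A keyword table stands in for the local model that would actually
    classify (and rewrite) each utterance.
    """
    note = {"Subjective": [], "Objective": [], "Assessment": [], "Plan": []}
    keywords = {
        "reports": "Subjective", "feels": "Subjective",
        "bp": "Objective", "temperature": "Objective", "exam": "Objective",
        "likely": "Assessment", "consistent with": "Assessment",
        "prescribe": "Plan", "follow up": "Plan",
    }
    for line in lines:
        lowered = line.lower()
        for keyword, section in keywords.items():
            if keyword in lowered:
                note[section].append(line)
                break  # first matching keyword wins
    return note

visit = [
    "Patient reports chest pain for two days",
    "BP 140/90, heart rate 88",
    "Likely stable angina",
    "Prescribe nitroglycerin and follow up in one week",
]
for section, entries in transcript_to_soap(visit).items():
    print(section, entries)
```

The privacy argument from the interview is visible in the structure: everything runs locally, and nothing is sent back into the model, since classification happens at inference time rather than tuning time.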