Our guest today is another fellow advocate, Oleg Šelajev. Oleg is a developer advocate at Docker working mainly on developer productivity, Testcontainers, and improving how we set up local development environments and tests. In today's episode, we discuss Testcontainers, Oleg's NODES session, and AI in the Java ecosystem.
Jennifer Reif: Welcome back graph enthusiasts to GraphStuff.FM, a podcast all about graphs and graph related technologies. I'm your host, Jennifer Reif, and I am joined today by fellow advocate Jason Koo.
Jason Koo: Hello. Good morning, everyone.
Jennifer Reif: And our guest today is another fellow advocate, Oleg. Oleg is a developer advocate at Docker, working mainly on developer productivity, Testcontainers, and improving how we set up local development environments and tests.
He's a developer, author, speaker, Java champion, Docker captain, and loves all languages. So welcome, Oleg. Thanks for being here.
Oleg Šelajev: Thank you for having me. It's very exciting.
Jennifer Reif: Do you want to start us off by talking a little bit about Docker and how you got into that?
Oleg Šelajev: Oh yeah, absolutely. So it all started many, many years ago. I was a backend developer, a Java developer at different companies, and then eventually an opportunity arose to transition into a more developer relations role on the marketing team. That was the start of my career as a developer advocate.
And then, after a couple of places, I joined a startup founded by my friends. They were called AtomicJar, and they were the maintainers of the Testcontainers libraries. For the longest time, I had been a big fan of Testcontainers. We used them at work, and I was always aware of how amazing that technology is and how it has a very unique spin on the developer experience, of all things.
So we talked, and I joined that very small startup as the first ever hire, back in 2021, I think, so a few years back. And Testcontainers are, of course, open source libraries. You can use them to create, configure, and manipulate things in containers and use them for pre-production use cases.
What they allow you to do is not rely on third-party pre-provisioned services for local development or for your tests and CI; the only thing they require to run is a Docker environment. They take a Docker environment, and then you, as a developer, can create environments with all the services you need by yourself.
And that's the very tight integration between the two technologies. Eventually, in December last year, Docker acquired the AtomicJar startup, and that's how the whole Testcontainers team joined the Docker team, which was absolutely excellent because, well, Testcontainers depended on Docker environments and the availability of a Docker runtime.
So there was a very clear synergy between the technical parts. But also, I think our engineering culture was a very good fit with Docker, and the Testcontainers libraries and projects also benefit a lot from having a tighter connection to the upstream Docker projects, right? We can make sure Testcontainers works better with Docker Compose, and we can maybe introduce some optimizations into the runtimes to make them more efficient for particular Testcontainers use cases.
So it was sort of a match made in heaven. I think it made sense for both parties. That was December, and after a few months of integration, I finally joined the official Docker DevRel team, where my areas of interest are still Testcontainers and developer experience in general.
And now I'm also transitioning to having AI use cases under my, sort of, watchful eye. But there I'm mostly learning about things. I have opinions, but please do not take those opinions as ground truth yet. We'll get to that level of confidence somewhere by November. Just kidding.
No, it's a very vast field, and a lot is happening every week, literally. So I'm there just learning and trying to match that with my previous experiences as a Java developer, as a normal nine-to-five engineer, and figuring out how to befriend that with this explosive growth of the AI ecosystem and everything.
I'm sure, coming from Neo4j, you're also very familiar with how disruptive the AI ecosystem currently can be.
Jennifer Reif: Yeah, I think we're all kind of feeling that. Everybody's on this learning curve, you know, some a little further along, some a little more behind than others. But we're all just trying to figure it out and see how it best fits.
Jason Koo: Oleg, since you brought up GenAI and kind of alluded to November, can you tell the audience a little bit about the session you submitted for the NODES conference?
Oleg Šelajev: Yeah, thank you. I didn't actually allude to anything in particular. I just picked some time in the future, sort of half joking that I'm going to have confident opinions about something in, like, three months.
But you are right. In early November, the NODES conference is happening, which people can join online, and I submitted a session about developer productivity for GenAI applications. We're going to explore how application developers can start using GenAI technologies in their applications as if those were normal components, the same as the rest of the ecosystem we currently have. I hope that the industry has figured out, more or less, how to build applications that depend on third-party services, and how to build microservices that talk via message brokers and things like that. What I think is happening now, or will happen eventually, is this commoditization of GenAI technologies.
It started with the release of ChatGPT a few years back. But now more and more people are experimenting with those technologies, and the products there are getting more end-user friendly. You don't need a PhD in AI or a lot of experience as a machine learning engineer to build applications using those services, or to use those services as part of an application in some capacity.
But I think there are also some interesting implications of that. Over the years, we figured out how to build normal applications where every line of code is actually a line of code describing what has to be done, and there were processes around that.
There is the whole DevOps culture. It wasn't created on a whim. We didn't one day wake up and go, ooh, DevOps is how we can make a lot of money for the whole ecosystem. No, we figured out that without tests, our applications are not reliable. They break, and then we don't know how to change them. And without a CI system, everyone is like, "Oh, it works on my machine."
And then the operations people are like, "Oh, we'll have to ship your laptop to the customer then." That's how we got to CIs, where there's a single source of truth. Then we figured out that monitoring is essential in production, on the outer loop of the DevOps process, because if we don't get the signal back into development, into the hands of the people who actually need to change things in response to production, then our deployment cycle will be, like, once every two years.
And then we're not moving forward. So all those things were created as responses to the challenges of real life and how actual projects work. If you will, this is the scar tissue that teaches us how to solve certain problems. And we've kind of figured that out now.
And now we are bringing these unknown components into applications, or we could be. Some part of my application's functionality is going to be an LLM doing something. For example, looking at some data set and suggesting some fuzzy analysis based on user input and some information that is currently in an unstructured textual form, which LLMs are great at.
They take unstructured data and transform it into a structured response. They can do many things, but that is sort of where they currently excel. So if part of my application's functionality is an LLM, and I'm an application developer, how do I need to transform my DevOps practices to accommodate that?
How do I test applications where some part is GenAI, which is non-deterministic? How do I need to think about security? Can I take a random Hugging Face model, package it somehow with Ollama, and have it as part of my deployment? And then what security risks are there in general? Or do I need to think about security differently?
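The testing question above can be made concrete. One common tactic for non-deterministic components is to assert on the structure and ranges of the model's response rather than its exact wording. A minimal Python sketch, where `fake_llm` and all field names are invented stand-ins for a real model client, not part of any particular library:

```python
import json


def fake_llm(prompt: str) -> str:
    """Invented stub standing in for a real LLM call.

    A real model would vary its wording between runs; the point of the
    test below is that it would still pass as long as the structure holds.
    """
    return json.dumps({
        "sentiment": "positive",
        "confidence": 0.87,
        "summary": "The customer is happy with the product.",
    })


def analyze_feedback(text: str) -> dict:
    # Ask the model for a structured JSON response instead of free text.
    return json.loads(fake_llm(f"Classify this feedback as JSON: {text}"))


def test_analyze_feedback_structure() -> None:
    result = analyze_feedback("I love this, works great!")
    # Assert on structure and value ranges, never on exact wording:
    assert result["sentiment"] in {"positive", "negative", "neutral"}
    assert 0.0 <= result["confidence"] <= 1.0
    assert isinstance(result["summary"], str) and result["summary"]


test_analyze_feedback_structure()
```

The same shape of test keeps working when the stub is swapped for a real client, because nothing in it depends on the model producing identical text twice.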
Jennifer Reif: Right.
Oleg Šelajev: Because we know that as developers, we currently do need to think about supply chain security and all of that. There are a myriad of tools that are shifting that responsibility more and more left, to developers.
So now we shift using AI to developers as well. Developers need to learn and know more about those processes. How do I do monitoring of a system that includes AI components? Do I just do the normal monitoring? Do I need to monitor for latency and, like, token counts and whatnot? Or can I do additional things? Because in the past, up until now, those components were taken care of by machine learning engineers.
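The latency and token-count monitoring mentioned here can be sketched as a thin wrapper around the model call. In this hedged Python example, `fake_llm` and the whitespace "tokenizer" are invented stand-ins; a real system would use its provider's client and tokenizer and ship the records to its observability stack:

```python
import time


def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer; whitespace splitting is an
    # assumption for illustration only.
    return len(text.split())


def fake_llm(prompt: str) -> str:
    # Invented stub standing in for a real model call.
    return "Paris is the capital of France."


metrics_log: list[dict] = []  # a real system would export these, not keep a list


def monitored_call(prompt: str) -> str:
    # Wrap the model call, recording latency and token counts per request.
    start = time.perf_counter()
    answer = fake_llm(prompt)
    metrics_log.append({
        "latency_s": time.perf_counter() - start,
        "prompt_tokens": count_tokens(prompt),
        "completion_tokens": count_tokens(answer),
    })
    return answer


answer = monitored_call("What is the capital of France?")
```

The wrapper pattern means the application code calls `monitored_call` exactly as it would call the model directly, so the monitoring concern stays in one place.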
Jennifer Reif: Yeah.
Oleg Šelajev: Right, or teams of machine learning engineers. And they have this MLOps sort of cycle for how to do that, which mirrors the DevOps cycle, but it's different.
But now, with the commoditization of AI, developers will need to do part of those things themselves, or at least understand where they can do something and where they need to rely on actually trained machine learning teams to supplement that. So we're going to explore that. I'm not saying that I have the ultimate answers.
But we are running a ton of small experiments and trying to figure things out. By the NODES conference, I'm sure we'll have a much better understanding of what this DevOps process could look like and where the actual blind spots are for application developers.
And how not to be completely lost when your boss comes to you and says, "Ooh, now implement this fuzzy but extremely important functionality using AI," and as a developer, you're like, well, I don't know what to do.
Jennifer Reif: Good luck.
Oleg Šelajev: So this is what the session will be about. I personally find this an extremely interesting topic, and I'm looking forward to learning more myself, preparing, and synthesizing other people's opinions.
I don't want to call it best practices, because I don't think by November we're going to actually understand what the best practices are. But outlining the questions, suggesting some solutions, and synthesizing other people's opinions about how to do this into a digestible format: that's what this session will be about.
Jennifer Reif: I think that'll be super valuable. For the last year or so, we've done a lot of experimenting, right? We've done a lot of playing around with these technologies: seeing how to integrate them, seeing how they work, trying to make them more or less deterministic, seeing how creative they can get, and all these sorts of things.
We're in the exploration and experimentation phase of that. But as we go into the next half of this year, I think that trend is going to slowly shift to, okay, how do we implement these things now?
How do we monitor these things? How do we maintain them over time? I think that DevOps perspective is going to be super important and looked at in a little more depth as we get closer to the end of this year.
Jason Koo: And you had mentioned you don't think you'll get to the best-practices phase yet, but this is clearly a very important step toward that, right? I'm going to bet that you'll have basically a draft of best practices by the time we get there, because things are moving so fast, and you're moving very fast
and doing a lot of work. It sounds like it's going to be a very meaty session. So...
Jennifer Reif: And who knows? Maybe
Jason Koo: Yeah.
Jennifer Reif: Maybe your, you know, thoughts and best practices will set the industry standard, Oleg.
Jason Koo: Yeah.
Oleg Šelajev: Yeah. Well, it's very important, I think, to have answers to questions like that which actually make sense.
The first very important step is to start asking those questions for real, then asking different people who are implementing and working with that stuff, and trying to figure out whether that matches your expectations and opinions. Because the AI field is also super, super fragmented, right?
There are those foundational model teams. They are very close to academia in the sense that they're doing their stuff, there are, like, a dozen of them in the world, and nobody else even touches the bubble they're in.
Those are teams like OpenAI and Meta and Google and others. They're pushing the boundaries of what we as humanity know about AI in general.
And then one step closer to myself, as a normal Java developer, are, say, the machine learning engineers and teams. They're doing similar things: they can maybe retrain or fine-tune models. They're doing this MLOps, they have training, they understand what they're doing, and they've been doing that before... how do you call it? B.C., right: Before ChatGPT. For years, because we were doing machine learning before that as well. So they have tools, and they have this internal experience of what makes sense and what doesn't in machine learning or AI.
And then there is the whole category of tools and supporting vendors doing developer tooling. This is Neo4j, right? The Neo4j database can be the vector database for the solution. This is Docker, because we provide the runtime for some AI components, and Docker containers or images can be used as a delivery mechanism.
And there are others: there is Ollama for the runtime, and there are frameworks that simplify this, integration frameworks like LangChain, or even Spring AI in the Spring Java ecosystem. They're all, kind of, developer tooling to make this easier to consume. And then there is the vast category of application developers who eventually might be the end users of all those previous technologies.
Okay, and then we might think, ooh, but ChatGPT doesn't need developers as end users to be a successful product. Which is true: you can build AI products that are, sort of, replacing people or services. You can have a virtual assistant that is AI-powered, or you can have a booking agent or a support agent, where the AI component replaces a person, or a person's job.
But you can also have AI components that are replacing parts of functionality. And for that, you need to make sure that application developers can work with all those previous groups, or the outputs of those groups. I think not a lot of the AI discourse currently talks about how that massive group of, well, essentially end users of this technology might need to be integrated into this.
Because, again, I'm not a Python developer. I can read Python, and I can write some Python, because the language is not the most complex, but it's foreign to me. I can work with it a little bit, but my experience is mostly in Java.
And there are a ton of developers who are more familiar with, say, the Node.js ecosystem, or the .NET ecosystem, or the Golang ecosystem. And very often currently, when we talk about AI, it's like, ooh, you can do this in Python with this bleeding-edge framework.
And that sort of doesn't help enterprises that much yet. So it's going to be very interesting to see where the industry moves from that perspective.
Jason Koo: If you had to make a prediction: right now there's a lot of tooling for GenAI in Python and JavaScript.
Obviously it'll spread into other languages, but which language would you like to see it more aggressively come into maturity in next?
Oleg Šelajev: What I think makes a lot of sense is Java, because Java is very often seen as the de facto language of enterprises. There are a ton of developers and a ton of mature tooling in Java. There are many opinions about Java as a language and the JVM as a platform, and those are sometimes well informed and sometimes a little bit outdated.
But we cannot deny that there are a lot of enterprise developers working with Java. So if someone can capture the market of Java developers and make it easy for them to use particular technologies, I don't think that's going to be a bad bet for any technology, whether from a developer tooling standpoint or from any AI technology's point of view.
There is a certain appeal to having the Java ecosystem on your side. I think we learned that very much from the cloud race as well, where Golang is the de facto cloud language.
A lot of cloud technologies are much easier to consume via those more interpreted languages that have less ceremony. But every cloud worth its salt, at least for some time before the AI hype got all of us, was trying to get Java developers onto their cloud, because those are the massive workloads. That's a massive amount of people writing and maintaining those applications: applications written at banks, at insurance companies, at national institutions. Everyone and their neighbor is writing Java. It might be less popular in startup culture, where you need to move fast and disrupt industries to make it to unicorn levels of success, but if you are a bank, Java is the workhorse. And currently we are also seeing a lot of AI technologies getting more and more popular in the Java ecosystem.
There are projects like LangChain4j, which is not a copy but a similarly inspired project to LangChain, in the Java ecosystem, integrating various AI parts. And then there are application frameworks in Java that help with integrating AI parts into their things, like, for example, Spring AI, or Quarkus, and part of that is done with Testcontainers, which makes me very, very happy as well.
Jason Koo: Since you were talking about Java and mentioned LangChain4j, and since Neo4j also has, you know, a Java sort of origin story:
when did you become aware of Neo4j or graph databases? What was your introduction to the space?
Oleg Šelajev: That's a good question. It was so long ago that I don't actually remember, but I did recently look it up. I think it was 2013, so more than a decade ago.
I was part of a company called ZeroTurnaround. We did tooling for Java developers, and it was based in Estonia. To give back a little bit and, kind of, re-energize the community locally, we decided, oh, we're a cool startup, let's have a local conference. So we organized a conference, and I think in 2013, there was a session by Michael Hunger.
It was a tutorial, a practical, hands-on experience with Neo4j, by Michael. And we looked at Cypher, right?
We looked at Cypher and at how you can do a little bit of data manipulation with some Cypher queries. If I remember correctly, it was around the Neo4j 2.0 time, which is very, very old. So since then, I think Neo4j was always on my radar.
It was always around the things I was doing. At some point in time, I was on the GraalVM team, which is, sort of, the polyglot runtime. It has interesting features for the Java ecosystem, from the compiler to ahead-of-time compilation and Native Image utilities. But it also offered, and still offers, a polyglot runtime, and it's actually a way to define and develop programming languages on top of the JVM in a unique way.
At that time, I remember we worked a little bit with Michael Simons, so my Neo4j knowledge comes from the Michaels. The Michaels are the essential tool in reaching out to the community. Neo4j, take note.
I'm kidding, but they're both excellent engineers and incredible educators. So Michael Simons looked at that, and he was also doing some experiments: Cypher has functions, right? In Cypher, you can create functions that call some pieces of code. So Michael had some experiments where those functions could be powered by the different programming languages available on GraalVM, in the runtime.
Neo4j always kind of nerd-sniped me with its cool factor, because somehow the demos were always great. And the Neo4j console that you have, where by default you can explore the data: that is mind-boggling.
Any project that wants to be successful should include something that polished for working with its primitives. That was around 2018, up to 2021. And then with Testcontainers as well, because Testcontainers is a way to run things in containers, and you can do that with Neo4j.
So if you want to develop your applications with Neo4j, you can use a Testcontainers module and just say, new Neo4jContainer, and that will spin up Neo4j in a container. You can pull data in and make your application work with it as if it were your production database.
And with Testcontainers, you get this repeatable, out-of-the-box environment for your tests and local development. So there was a lot of synergy between Testcontainers and Neo4j as well. It was always on a parallel track next to my interests.
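The episode describes the Java-style "new Neo4jContainer" usage; as a hedged sketch, the analogous Testcontainers module for Python looks roughly like this. Actually running it requires a Docker environment and the `testcontainers` package, and the image tag is an assumption, not something specified in the conversation:

```python
NEO4J_IMAGE = "neo4j:5"        # assumed tag; pin whatever your application targets
SMOKE_QUERY = "RETURN 1 AS n"  # trivial Cypher proving the database is up


def run_neo4j_smoke_test(image: str = NEO4J_IMAGE) -> int:
    """Spin up a disposable Neo4j container, run one query, tear it down."""
    # Imported lazily so the sketch reads without the package installed;
    # running it needs `pip install testcontainers` and a Docker daemon.
    from testcontainers.neo4j import Neo4jContainer

    with Neo4jContainer(image) as neo4j:   # starts the container, removes it on exit
        with neo4j.get_driver() as driver:  # Bolt driver pointed at the container
            with driver.session() as session:
                return session.run(SMOKE_QUERY).single()["n"]
```

The `with` blocks are what give the repeatability described above: every test run starts from a fresh container and cleans up after itself, whether the test passes or fails.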
Neo4j as a company is very, very interesting and, I think, has a particularly great understanding of developer experience and developer ecosystems at large.
If you're looking at graph database use cases, or a little bit at RAG implementations, take a look. They are good people with good products.
Jennifer Reif: Well, thank you.
Jason Koo: Thanks. Thank you.
Jennifer Reif: Oleg and I have run into one another over the last several years at various conferences and in different places, and it's always a pleasure to run into him and talk tech.
Do you have any quick tidbits you would recommend for people who might want to submit a talk to a conference, or topic matters, or tips and tricks for somebody who might be new?
Oleg Šelajev: Right, yes, this is a good question.
First and foremost, you just have to do it. It's not knowledge; you don't learn how to do this from books. It's a skill. You need to practice it, and it's a journey. If you're considering what to talk about, dig into your experiences and figure out what you are good at. You don't have to be a world-class expert on any particular topic for your session to be useful. There is a lot of value in being relatable on stage, explaining and talking about particular things from the point of view of a normal human being.
You can have experience from a particular project, or particular innovations you did within it, and reach for content from there: ooh, this is how we built things manually and what we discovered about a particular topic before it was cool. This is, I don't know, how we used Testcontainers for our development setup before Spring Boot integrated that as a default solution, and this is what we learned. Because when you work on something, you become an expert in that particular thing, and you have insight that nobody else has,
because of your experiences. And as long as that is relatable, it will be interesting. So there is that: you have content, just try to put it on paper, try to submit.
The second piece of advice is that when you're putting things into the abstract or the submission form, you are not doing that for the actual attendee at this stage yet. You are doing it for the program committee, for somebody who needs to look at it and say, yes, this session needs to be in our program.
So you first need to worry about passing the CFP, in whatever form that takes. There are a lot of ways to do this, and the best is just to do it and show it to your friends. Form connections. I know it's hard talking to other people.
Ah, how horrible and complicated and unnatural that is. But really, making the first step and showing it to people will get you halfway there. They don't have to be experienced speakers themselves, or conference-goers; just a second pair of eyes can help you a lot.
Because when you make something, you feel weirdly attached to it. And then other people look at it and say, well, rejected, we don't need this, and that also hurts very much.
So that's the third piece of advice: it's going to hurt. Rejection hurts. Everyone gets that; I'm rejected from conferences all the time. Just continue working on it. It's not you as a person; it doesn't reflect on your quality as a person. It just means that your words in the submission weren't convincing enough. It doesn't even say anything about the quality of the topic or the idea. It just means it failed to convince the particular set of people who looked at it. So that's three.
And then fourth: at one point in time, you submit, describe, and convince people to give you time on stage, and at another point in time, you actually need to be on stage. There is some time between those, and usually a significant amount. So you don't have to be as ready when you submit as you'll need to be on stage.
You don't have to figure out all the details. You have the rough idea, and you have to kind of have a gut feeling that there is enough content that is relatable and interesting. This tripped me up very much at the beginning of my career, when I'd think, oh, I have to have the content, say, 90 percent ready before I can submit and talk about this. That's not true. I just need to imagine what needs to be there, believe those problems and solutions will be interesting enough, and know that I have enough time in my work life, or in my life, to prepare if somebody says, "Yes, Oleg, we want to give you a stage."
It never happens that it's next week. Usually it's a few months away at least. And then you have the timeline, you have the deadline, and you can work on it and make progress, as with any other task.
So those items are separate in time; you should not forget that. And with those three and a half ideas, I think you can start conquering the world of public speaking. It's very rewarding after you've done it. It's very nerve-wracking and stressful when you have to prepare in the last couple of days.
You will get the jitters. You will get stage fright; my hands shake every time before I go on stage.
Jennifer Reif: Same.
Oleg Šelajev: A little bit. I tell myself that a good horse shakes a little bit before the race, you know, in anticipation rather than stress. But it doesn't go away, for me at least.
But after, when you're on stage and you can connect with people and you see that they're responding in the way you imagined, that is an incredibly rewarding feeling.
Jennifer Reif: Yep.
Oleg Šelajev: So do it. I recommend doing it. And if you want more personal advice, reach out; we can talk, and maybe I can help personally as well. So...
Jennifer Reif: Fantastic, thank you so much. Just to wrap up this piece, do you have some upcoming events or places you're going to be that you want to give a shout-out to for listeners?
Oleg Šelajev: Yeah, absolutely. Summer is usually a little bit of a slower time for conferences, because lots of people are on vacation.
I think my next public slot will be at WeAreDevelopers in Berlin in July, I think July 17th to 19th. There are a lot of topics there, across all kinds of themes, questions, and areas. I'm going to be talking about Testcontainers and mocking technologies,
looking at WireMock, for example, for mocking services for your integration tests and your local development experience. But there are also quite a few AI-related events there, and I think Docker even has an event that we're participating in somewhere next to the conference, which will be a meetup on one of the days.
Jennifer Reif: Yes. We've got that on here as well, actually. The Ollama and Friends Coming to AI Tinkerers. Does that sound right?
Oleg Šelajev: Yes. Yes. That's, that's us. Yes.
Jennifer Reif: Okay, great.
Oleg Šelajev: We are there together. Perfect. Perfect.
Register for the AI Tinkerers event. We did a little meetup similar to this just last week in Paris as well, and it was an absolute blast. I couldn't participate, which is very, very sad, but it was incredible, and I would sincerely recommend it: if you are a developer and you're interested in AI topics, in how to work with it, what technologies are there, what tools are there, consider joining. I think it's going to be very, very interesting. And, well, it's Berlin in July. What's not to love?
Jennifer Reif: Sounds great. Do we want to highlight our tools of the month really quickly?
Okay, I guess I'll drop in and go first, since I'm already chatting. I had played around with Kubernetes for a little side project I've been working on, running Neo4j in Kubernetes, and someone pointed out to me that the Neo4j documentation on this is actually superb.
I followed through the documentation, and they are definitely right. The documentation is very thorough. There's a lot of good detail there, plenty of links for sub-steps and other things you might need, or continuing-education places to go. It was really, really helpful: a nice walkthrough, with a little bit of jumping around to different links for pre-steps and intermediary things, but for the most part very fluid, nice tutorial-walkthrough documentation.
So, excellent job there, just to highlight that. It was very easy to follow and really nice. If you're interested in running Neo4j on Kubernetes, definitely start there, and then jump to wherever else you need after that.
Jason Koo: Cool. I'll go next so we can close out with Oleg's tool.
So I've been working on a blog post about Neo4j's new GenAI Python package, and while I was doing it, I was like, I want a better hero image, something that was animated. So I used Haiper, H-A-I-P-E-R dot AI. It's a text-to-video generator, but it also takes in images, right?
So you can drop an image in and tell it to modify that image and turn it into an animation. So I took a custom image from DreamStudio, modified it a little bit, and then threw it into Haiper to make it into an animated video, which I then turned into an animated GIF. And I thought it came out really well.
I was impressed with that service. So that's my tool of the month.
Jennifer Reif: That sounds like fun.
Jason Koo: It was. Yeah. Oleg, what's your favorite tool of the month or something you'd like to recommend to people?
Oleg Šelajev: Yeah. I don't know how favorite it is yet, but lately I've been working in AI, right.
And then specifically from the point of view of the application developer getting to work with AI. And one tool that is very, very interesting - and I'm still learning how to apply this properly - but this is the tool of the month for me: the ragas framework, which helps you evaluate your RAG, your retrieval-augmented generation, pipelines, right?
So what it allows you to do is run sort of tests on synthetic data, or some other data, against your RAG setup, and then it uses an additional LLM to evaluate parts of the responses that your AI component gives you. And it packages that in this open source experience where you can start using it, and it gives you a repeatable experimental setup for understanding the responses that your models give you, and provides you with a framework to evaluate that. Maybe not to evaluate it yourself as an application developer - maybe you don't actually have the particular skills for that - but to understand what is happening. And you can use that within your development process to track and understand changes to those LLM components in your RAG application.
It sounds very, very interesting, and it sounds like ragas is one of the projects that has a lot going for it, from being open source, for example, to providing you with an opinionated framework for some things, which is a very good thing for developers, because I don't need to reinvent the wheel out of the lower-level tools.
So ragas.io is the website. And if you are building with AI, or if you're building RAG applications, maybe take a look at that. I don't fully understand where it should fit in the development process yet. Do I run it in CI only? Does every developer run it locally when they introduce changes?
Do I run it only on, like, a pre-production environment to evaluate it, once a set of changes comes in? But those are additional questions to figure out. Anyway, it's currently my tool of the month to take a look at.
Jason Koo: Okay. And how do you interface with ragas? Is it config-based, or are you interfacing through Java?
Oleg Šelajev: Well, currently I think it's just a service, right? So you can run it - you install it with Python, and then you run it, and then you have sort of your tests declared there.
Jason Koo: Okay.
Oleg Šelajev: So currently, I think it's like a mix of Python and some dashboards.
Jennifer Reif: Very nice.
Oleg Šelajev: But I'm sure they have a ton of integrations with other projects and everything, if you want to take a look at that.
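For listeners curious what this kind of RAG evaluation looks like in practice: the faithfulness-style metrics Oleg describes boil down to an LLM-as-judge loop over your pipeline's outputs. Here is a minimal sketch in plain Python, with a stubbed word-overlap judge standing in for the real LLM call - the function names and heuristic are purely illustrative, not the actual ragas API:

```python
def stub_judge(claim: str, contexts: list[str]) -> bool:
    """Stand-in for an LLM judge: treat a claim as 'faithful' if every
    word of it appears somewhere in the retrieved contexts."""
    context_text = " ".join(contexts).lower()
    return all(word in context_text for word in claim.lower().split())

def faithfulness_score(answer: str, contexts: list[str]) -> float:
    """Fraction of answer sentences supported by the retrieved contexts."""
    claims = [s.strip() for s in answer.split(".") if s.strip()]
    if not claims:
        return 0.0
    supported = sum(stub_judge(claim, contexts) for claim in claims)
    return supported / len(claims)

# One evaluation sample: a question, the contexts the retriever returned,
# and the answer the LLM produced. The second sentence is a hallucination.
sample = {
    "question": "What database does the team use?",
    "contexts": ["The team stores graph data in Neo4j."],
    "answer": "The team stores graph data in Neo4j. It runs on Mars.",
}
score = faithfulness_score(sample["answer"], sample["contexts"])
print(f"faithfulness: {score:.2f}")  # one of two claims supported -> 0.50
```

In ragas itself the judging is done by a real LLM over a dataset of question, contexts, and answer samples, and it ships several such metrics rather than this single toy heuristic.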
Jason Koo: Okay, we'll put the link in for everyone to look at.
Jennifer Reif: Yeah. And then just a couple of highlights for the Neo4j segment. There was a blog post, and several tools have been highlighted as GenAI ecosystem tools.
So if you're in the Neo4j GenAI space, check out the blog post and then all of the links within that for Neo4j Knowledge Graph Builder, NeoConverse, which is the text-to-Cypher, and there's a bunch of LLM framework integrations, several of which have starter kits that go along with them. There's also a new project that's kind of being highlighted this month in Neo4j called Project Runway.
And there's a GitHub repository that I'll link as well. That does data integration, or I guess passing data back and forth, if you want. Do you want to talk about it, Jason?
Jason Koo: Yeah, just at a high level, right? I think it was mainly designed for helping people move relational data from more traditional databases into a Neo4j instance.
So I think it abstracts away some earlier complexities to make it easier, was my understanding.
Jennifer Reif: Nice. Great. There are several blog posts out. Again, one on Project Runway, as well as on the GenAI ecosystem tools. So if you're interested in those two projects, there are blog posts that accompany them. There are some others out there for some starter kits and more GenAI things, and a GenAI graph gathering that went on this month as well. As far as upcoming events, we mentioned where Oleg's going to be next month. At the end of July, I will be at THAT conference in Wisconsin.
So if you're up around the Midwest US area, then definitely join us there. It's going to be a fabulous event, or so I've heard. This is my first time going, so I'm really excited about it. And then, Jason, any place you're going to be in July?
Jason Koo: Yes, so I'll be traveling in Japan in July. So July 9th, if anyone happens to be in or near Osaka, I've put together a kind of graph meetup with Koji.
So, yeah, if you're in that neighborhood, definitely come out and join us.
Jennifer Reif: Perfect. We didn't have quite as many events this July, which was kind of nice for a change of pace, I guess because, like Oleg said, of the summer holidays and so on. But there are several events scattered throughout the globe, so if you're around and interested, definitely check those out.
I'll link everything in the show notes as per usual. And we want to give a huge shout out to our guest, Oleg, for joining us today and talking about anything and everything from, from DevOps to AI to NODES submissions and so on.
Oleg Šelajev: Thank you so much for having me. It was really, really fun.
Jason Koo: Thank you, Oleg. I'm really looking forward to your talk. It's gonna be great.
Oleg Šelajev: Me as well. Me as well. Now there is a hole in the timeline. Now I need to prepare.
Jennifer Reif: Thanks everyone for joining, and we'll talk to you soon in another episode. Cheers.
Jason Koo: Bye everyone.