Today's guest is Michela Ledwidge, Creative and Technical Director at Mod. She's a hands-on director who specializes in real-time and virtual production and runs a studio known for solving ambitious creative-technical challenges. Michela is passionate about storytelling and interactive experience design. In closing, we'll cover our favorite tools of the month and highlight events we're planning on attending in the near future.
Speaker Resources:
Tools of the Month:
Announcements / News:
Jennifer Reif: Welcome to GraphStuff.FM, a podcast all about graphs and graph-related technologies. I'm your host, Jennifer Reif, and I'm joined today by fellow advocate, Jason Koo.
Jason Koo: Hello.
Jennifer Reif: And as our guest today, we have Michela Ledwidge, creative and technical director at Mod.
Michela Ledwidge: Hey.
Jennifer Reif: She's a hands-on director who specializes in real-time and virtual production, and runs a studio known for solving ambitious creative technical challenges. Michela is passionate about storytelling and interactive experience design. So Michela, welcome. Thanks for coming.
Michela Ledwidge: Oh, thanks, Jennifer. Nice to be here.
Jennifer Reif: So do you want to tell us a little bit about how you got introduced to Neo4j?
Michela Ledwidge: Sure. I've been using Neo4j for about nine years. It actually started when I was researching a documentary that I wanted to make, which had a VR component. I don't know if you've seen An Inconvenient Truth, the Al Gore PowerPoint presentation that turned into an Academy Award-winning documentary about climate change. I had this idea that I wanted to do a documentary looking at dark money trails in Australia, and I thought it might be cool to actually explore the money trails in virtual reality as the presenter of the documentary.
So I started building a system that would allow me, as a presenter, to have my little portable virtual world, and we would have all of this deep diving into data, sort of Minority Report style, like that old Tom Cruise movie where he's wearing his gloves and doing his police investigations.
Jason Koo: Great movie.
Michela Ledwidge: And at the time, because my studio does virtual production, the idea was very much that we'd take quite depressing, boring government data and make it more interesting to look at and experience, because I'm playing with it live in virtual reality. And then any data journalists out there, or activists, could just get a Jupyter Notebook or some kind of data science tool that would allow them to access it.
And it was in the making of this project, an original project from my studio, all bootstrap funding, very under the radar, that one of my developers pointed me at Neo4j. And this was just after the Panama Papers had won a Pulitzer Prize. Anyway, long story short, Neo4j kind of threw a huge spanner in the works, because I kind of gave up; I didn't even bother doing the Jupyter Notebook in the end. We ended up building the entire production platform on top of Neo4j.
The plans to have it as a location-based experience that would travel around, where I would do live shows, more like a VR touring exhibition, COVID kind of killed that, so we ended up just publishing it on Steam. But in the final result, you were basically interacting with a Neo4j database live.
So since then, we've had interest from various people, and we've spun out that core system as a plug-in called Grapho. It's not quite ready for a consumer product yet, but we've been selling it as an enterprise toolkit, and we've been beavering away on our next original, which is going to be using that as what we're calling an interactive graph mechanic.
Jennifer Reif: Well, that led very smoothly right into our next question, which is what is Grapho? But maybe just following up on that, what are the types of use cases, obviously the data exploration use case that you just talked about, but what are some of the use cases you're seeing people and businesses use with this?
Michela Ledwidge: Yeah, look, it's a good question, and I don't have a definitive answer yet because it still feels very early, but we've expanded what Grapho is to be beyond just a VR product. The Grapho brand we call data science plus storytelling. And the idea is that it's a very people-focused approach to, I guess you could say, a provocation that a one-size-fits-all approach to data science and graph data science in particular isn't going to work.
So as we all dive into these amazing automation tools and new pipelines, at the end of the day, there are still humans in the loop, there is still the need to customize, and I think the real opportunity of Grapho is identifying the gaps where the automated systems don't work.
So the use cases that have popped out so far have been, my original use case was I just wanted to grab two pieces of information and grab the link in between and talk about it with people. And that's actually quite a tough engineering problem in VR. It's easy to do it if you've got two 3D objects and one 3D object connecting them. That's fairly trivial, but to have a general purpose solution so that any graph can be opened, you don't even know what the data is and you can just go, "All right, I want to look at that specific relationship within a graph." That's actually quite a bit of work, and that's what we've built. And so that was very much data storytelling. That was the first key use case.
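To make the "two pieces of information and the link in between" idea concrete, here is a minimal sketch of the underlying lookup, run against a toy in-memory property graph rather than a live database. The node names, IDs, and the FUNDED relationship are all invented for illustration; the docstring shows roughly equivalent Cypher.

```python
# A toy property graph. A general-purpose viewer can't assume any of these
# labels or property names in advance -- they're invented for this example.
nodes = {
    1: {"name": "Donor Pty Ltd"},
    2: {"name": "Campaign X"},
}
relationships = [
    {"start": 1, "end": 2, "type": "FUNDED", "props": {"amount": 50000}},
]

def link_between(a_id, b_id, rels):
    """Return every direct relationship between two nodes, in either direction.

    The roughly equivalent Cypher would be:
        MATCH (a)-[r]-(b) WHERE id(a) = $a AND id(b) = $b RETURN r
    """
    return [r for r in rels if {r["start"], r["end"]} == {a_id, b_id}]

found = link_between(1, 2, relationships)
```

The hard part Michela describes is doing this generically in VR for any graph; the lookup itself is the easy bit.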
Then the second one for me is visualization. And that's how, so Neo4j has actually been using GraphoXR at its trade booths around the world, which is really exciting because obviously we're using their database, we love it, but they're using it specifically to communicate what a graph database is in the first place in a very non-technical way.
So those are opportunities. When we say data science, I'm very much on the applied data science side. I probably will never come up with a new algorithm that ends up being published in a scientific paper. But our work has been published as: how do you get this information in front of people in new ways? And I think that concept is always going to exist. It doesn't matter how amazing these new AI tools become, you're still going to want to see stories told in interesting ways. You're still going to need to present information in a way that's not cookie cutter, where everyone else is presenting their information the same way, so we need something to distinguish how we do it. That's where Grapho comes in.
Jason Koo: And Grapho is amazing. You had mentioned Minority Report earlier, but now that I think about it, many science fiction movies throughout the late-'90s, maybe early-2000s visualize data in a very 3D way. And what Grapho has done is kind of taken that sort of imagination, that vision, and made it like, oh, now I can actually interface with data in this way that Hollywood had been kind of showing for many, many years.
Michela Ledwidge: Yeah, it's not a new idea. And my whole career, I've kind of moved between science fiction and science fact. I was having a conversation with a very prestigious data engineer a few days ago, and he said, "I don't believe you need to literally see graphs. The whole power of graph databases doesn't need to be exposed literally to the end user. It's informing, improving LLMs' ability to give you good query results. That's what its value is." And I said, "Look, I don't think everything has to literally be in front of you. The way Hollywood visual effects are created is for a very specific purpose, which has nothing to do with accuracy of the data. It's about telling the story points. The reason we don't use Jurassic Park-style 3D Unix file managers is because they just don't work in practice, and 3D interfaces on 2D screens to graph databases don't work terribly well either."
But I had an itch I wanted to scratch, and I put it out as an experiment. The experiment has now evolved to the point where, for certain classes of graph data, and I think for simple sets, it's very useful. And the irony is that while everyone loves these amazing complex data sets, us poor human beings can't actually process more than X number of things. So there's real value in the ability to create a graph of 15 things, which are maybe summarizing quintillions worth of information, but you've boiled it down for an executive presentation. I personally think that, before you make a big decision about interconnected data or interconnected relationships, a baked-in 10-slide deck is not as useful as presenting perhaps 20 pieces of information as a graph that you can literally grab and look at and digest and think about.
But I can't prove that until there's more usage of the tool. I do feel like we're kind of exploring the value proposition like the mouse. The mouse itself isn't a big moneymaker, but when you're at your desktop, why not have one handy? Why not have this mechanism available for certain use cases where it may prove useful? That's the way I'm proceeding with the development of the product.
Jennifer Reif: What I found really cool, because I actually had a chance to play with the graph through the VR headset...
Michela Ledwidge: Oh, great.
Jennifer Reif: ... when we were all kind of co-located for a meeting, and Alexander handed me the VR headset. He's like, "Here, try this." I was like, "Okay." What I loved about it, and I've been at Neo4j, I've worked here for several years, I've gotten used to explaining what graph is to people who don't know graph. And to me this makes it so much more real.
All their questions about how does this actually work, what do you mean you store the relationship, what does that look like? Just saying, "Here, look at it. This is the way we're storing the data, and this is what it looks like," is so powerful. And as you said, humans are very visual beings, and being able to see and navigate the data in a real space, or a virtual real space, and see what it actually looks like and what it's doing, helps solidify those concepts very quickly without a ton of words to try to explain it and make it relatable.
Michela Ledwidge: A friend of mine, Dale Harris Drag, was one of the advisors on the Minority Report film. There was a real brains trust that Spielberg put together for that movie. And he used a phrase in talks I saw him give many years ago, "the wall of the new," and I love that phrase, partly because it's a cop-out. If you're someone like me, an artist and a director and a technologist who's building stuff, and you build something kind of original, or you think it's original, and people can't work it out, you're always looking for excuses. And then you have to go, "No, hang on a sec. It doesn't matter what you are trying to build. If the users can't actually pick it up, that's your problem. You've got to own it."
However, Dale pointed out that there is a wall of the new: there is a certain level at which, when something is outside of your framework, it's so much harder to communicate until you've got more shared understanding. And I've found with VR in general, let alone something like GraphoXR, that it's actually a lot easier to teach someone who's never used VR before how to use the app than to explain what on earth it is in the first place. And that's actually the most satisfying aspect of the product development at the moment: in the context that Neo4j's been using it, I've observed 90% of the people who try it have never used VR before, let alone something like this.
So you're doing two things. You're saying, "Okay, here's a new medium and here's a practical application, a utility within that new medium." And once you've jumped through those two steps, then you can actually go, "All right, now what would you like to use it for?" That's really exciting to me, because I think XR as a medium is misunderstood.
My dad sent me a clipping from the Wall Street Journal yesterday, talking about the big economic downturn and the difficulties in the tech industry. It had this great sentence about how companies are cutting areas that aren't moneymakers, like VR and devices. And that's why I thought I would bring up the mouse, because, yeah, a mouse is not a huge moneymaker for most industries today, but until you're actually using a mouse in a practical sense, it's a bit hard to communicate what its value is.
Jennifer Reif: Yeah. And that's another aspect you've now touched on too: previously I looked at VR spaces and I saw them as kind of entertainment spaces. That's often what they're used for, right? So seeing something that is business-oriented or, as you said, utility-oriented, using this in a capacity for learning, or for showcasing what it can do, or helping people understand data or the world around them, I think is super powerful and super interesting, and not something that's really been explored a whole lot yet.
Michela Ledwidge: Yeah, I think that's fair. And I mean, I have my feet in both enterprise IT, because that's where I started, and indie production, as I've been running an indie studio for the last 20 years. So make no mistake, one of the use cases, probably less relevant to your audience, is running studios.
Neo4j has gone from this boutique, sort of niche technology stack to one of the core components of our studio pipeline over the last nine years. And I'm really excited about that, because it means that as I get more proficient as a practitioner, jamming with Cypher or playing around with these weird-ass XR interfaces, I'm finding myself training my team to go, okay, so we may have an entertainment experience where there's a data viz element, and of course we'll just use Grapho, because we've got an Unreal Engine plugin that can make things look cool.
But the entire story universe of this title, which I've been working on for, say, 30 years, I've got ontologies, because I started writing it with RDF 20 years ago, and of course it's all been converted to a property graph now. You may know the brand.
So we're actually on our next title. We have a production graph, which is the sort of stuff that the Commonwealth Bank would be doing. We have the story world graph, which is how the characters relate to each other. There are lots of characters that have different identities, and it's a bit complicated for some people just reading the script, so they can actually look at the story world instead. And then we've got something that even the latest Spider-Man games don't have, which is that as we redesign puzzles or change the chapter flow just by adjusting link relationships in a tool like Bloom, the game demo updates, because we're using Neo4j as a game design tool.
And I don't know how many studios are doing that, but it's part of our special sauce: as a tiny little indie team, we get to do big projects because of the level of automation and procedural stuff we do. So the way this knowledge management sector is growing, the way in which GenAI is playing off better quality data, is kind of music to my ears, because as a nerd who has got our own studio to try and tell stories in a different way, I see a lot of convergence happening at the moment around good quality data.
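The idea of chapter flow living in the graph, where re-pointing one link reorders the playable demo, can be sketched in a few lines. This is purely an illustration of the mechanism, not Mod's pipeline; the chapter names are invented, and the dictionary stands in for NEXT-style relationships that would live in the database and be edited in a tool like Bloom.

```python
# Chapter flow stored as relationships, as it might be in a graph database.
# Re-pointing a single link (e.g. in Bloom) changes what the demo plays.
flow = {
    "Prologue": "Chapter1",
    "Chapter1": "Chapter2",
    "Chapter2": "Finale",
}

def play_order(start, next_of):
    """Walk the chain of 'next chapter' links from a starting chapter."""
    order = [start]
    while order[-1] in next_of:
        order.append(next_of[order[-1]])
    return order

order_before = play_order("Prologue", flow)

# A designer redirects one relationship: Chapter1 now leads straight to Finale.
flow["Chapter1"] = "Finale"
order_after = play_order("Prologue", flow)
```

The game build never hardcodes the sequence; it just re-walks the links, which is what makes "edit the graph, the demo updates" possible.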
Jason Koo: This whole behind-the-scenes production use of Neo4j that you just talked about has opened up a can of questions for me. So I'll just start with one: in terms of using it for managing the operations of the studio, is your entire staff trained to use Browser and Cypher, or did you build a dashboard on top, some sort of interface that allows staff to more easily work with graph data?
Michela Ledwidge: Yeah, that's a really good question. And we have a very small team. One thing that Neo4j doesn't do terribly well is schema version control. I come from a Python dev background. We're using tools like Django that have really robust ways of handling data model changes: you can basically track the deltas of how the schema evolves over time. And there's nothing I've found within the Neo4j ecosystem that is really as robust as that.
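One lightweight workaround for the schema versioning gap Michela mentions is to snapshot the schema yourself and diff successive snapshots, a poor man's version of what Django migrations record for relational models. The snapshot contents below are invented; in practice the label and relationship-type lists could come from Neo4j's `CALL db.labels()` and `CALL db.relationshipTypes()` procedures, saved alongside the codebase.

```python
# Diff two schema snapshots (e.g. label sets captured at different times
# via CALL db.labels()). The label names here are invented examples.
def schema_diff(old, new):
    """Return what was added and removed between two schema snapshots."""
    return {
        "added": sorted(set(new) - set(old)),
        "removed": sorted(set(old) - set(new)),
    }

v1 = ["Character", "Chapter", "Puzzle"]
v2 = ["Character", "Chapter", "Puzzle", "Location"]

delta = schema_diff(v1, v2)
```

Committing such snapshots to version control at least makes schema drift visible over time, even without a full migration framework.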
So what I've typically done is have sandboxes where people can get comfortable with Neo4j. Because we are such a small studio, I am pulling the strings. As the writer-director of the product I just mentioned, I'm writing in scripts and I'm writing in graph as part of the script exercise.
So there isn't a large number of people editing the graph, but I've made sure that read-only interfaces to both Bloom and Browser are available to all the devs and my virtual production generalist roles as well.
But I mentioned the puzzle design because my vision for this is that, in the same way as a company like Naughty Dog, which makes The Last of Us games, has very specialist tools for designing how the interactive storytelling and the puzzles and even the combat systems evolve, for a very specific job role, the level designer, I think the kind of business analyst use of Bloom maps really nicely, at a very high level, to how a level designer thinks in doing a game or a simulation or a training scene.
With A Clever Label, which was the documentary we released three years ago, we did have a bespoke front-end to Neo4j, which just presented a 2D desktop view of what was happening in VR.
Today I'd probably use something like NeoDash. We've been having a lot of fun building out cheap and cheerful widgets, and with a new customer, I've just sold them on the idea that we'll whip up some new React widgets within the NeoDash framework before we go to the expense of making something completely bespoke, because Neo4j is throwing out lots and lots of different ways to interact with your data.
Sorry, very long and sprawling answer to your question. I guess the way to sum it up is to say yes, dashboards, for example NeoDash, and also applied data science through Jupyter Notebooks. I'm a huge fan of doing the R&D for our production pipelines in notebooks, just so that it's easier to share new findings and insights with a potentially less technical audience.
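As a flavor of the notebook workflow Michela describes, a common pattern is to pull query results into plain Python records and summarize them inline for a less technical audience. The rows below are invented stand-ins for a driver result; the commented call sketches roughly where they would come from with the official Neo4j Python driver.

```python
# In a notebook, results typically arrive as a list of record dicts, e.g.:
#   with driver.session() as s:
#       rows = [r.data() for r in s.run(
#           "MATCH (n:Entity) RETURN n.name AS name, n.total AS total")]
# The rows below are invented stand-ins for such a result.
rows = [
    {"name": "Org A", "total": 120000},
    {"name": "Org B", "total": 45000},
]

# Summarize inline, right next to the prose explaining the finding.
top = max(rows, key=lambda r: r["total"])["name"]
grand_total = sum(r["total"] for r in rows)
```

Keeping the query, the records, and the one-line summary in a single notebook cell is what makes the finding easy to hand to a non-technical reader.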
Jason Koo: No, that's super cool that you're using NeoDash so much, and it really is a flexible tool. So is your team making PRs to NeoDash as well, to add in new multimedia widgets?
Michela Ledwidge: Not yet, but we have submitted some bug reports, and it's getting some heavy use via the Neo4j activations. And this project I mentioned, which I can't talk about yet, is going to give us, I think, the opportunity to do exactly that.
But I mean, part of the whole GraphoXR approach was to get people thinking away from dashboards. So it's a bit ironic that I've gone down this whole rabbit hole into Neo4j instead of using, say, a Jupyter Notebook, and now I've come back to the point of, oh, for some use cases, that's actually what you want. But the most important thing is that we've got the right interface for the right audience.
Another use case for our staff is training and simulation. We have a really interesting simulation that we are working on for social workers working in First Nations Australian communities. I don't know how much you know about Australian Aboriginal and Torres Strait Islander culture, but you can probably imagine, from wherever you've grown up, indigenous communities who are sometimes disenfranchised and have social workers arrive to kind of help with issues, and those workers come with no understanding of the environment they're coming into, and they cause offense and break cultural protocols and all sorts of things.
So we're actually doing a behavioral sim, and again, we're going to use Neo4j as the way of structuring all the interactive narrative, but also the design of what is in the world of this sim. And the end users are going to be very non-technical. There probably will be a VR component, but everything has to have a desktop equivalent.
Because it's so culturally sensitive, every decision we make within this training product is going to have to be checked by extremely non-technical stakeholders, some of whom won't have a desktop computer. They're in extremely remote parts of the country, which may have only got cell reception a few years ago. And so we have to think really broadly about, well, what's the quickest way to get digital data in front of people? And I like that sort of thing, because it says, well, we might use some incredibly powerful high-end tool like Unreal Engine, which ILM's using to do The Mandalorian virtual production stuff, and we can do fancy VR stuff, but the same data has to be realized cheap and cheerful, so that it works on your phone or can be printed out with a big red texter circle around it.
And so I think there's always going to be a place for dashboards and notebooks alongside XR. And part of what the non-XR side of Grapho is, is to make sure that we have the ability to keep our dashboards, notebooks, and XR products across potentially every device, because that's a really expensive part of this. Facebook brought out glasses yesterday. There's constant change in the marketplace. But I do think the success of Grapho is that we can say, all right, we're serious about storytelling. We've got some shiny things that are perhaps a little bit different, but we still have the bread and butter. We are serious about you understanding the story, so we come to you.
Jason Koo: So the three of us, we've experienced Grapho, and I'm sure by now our listeners are like, "Oh, how do I experience Grapho?" Because at least with the version we've used, we're basically running it in developer mode on our devices. So Michela, can you talk to how someone might get a chance to experience this other than catching one of us at an event?
Michela Ledwidge: Yeah, definitely. So we are planning a public demo in the next couple of months. I'm actually hoping to have it next month. We'll have a downloadable demo that is just hardwired to six or seven test graphs, and that will probably just be available as an APK that you can download in developer mode to start with. And we're in the process of looking at our publishing strategy.
The VR marketplace is really tricky. We probably will end up publishing it on the Facebook/Meta marketplace, but we haven't yet for various reasons. It's partly because we've been trying really hard not to be locked into one particular ecosystem, because as Jennifer said, I'm really serious about this being a tool, a utility, and not just a quick buck. I'm using this for production purposes within my studio.
I think the first stage of the release, we'll make an Android file download for developers who have the gear and have developer mode to play with, and then we'll follow that up with a direct to consumer release.
But it's actually quite difficult. The whole developer mode situation is a bit of a debacle; it's not an easy space to be in, and I think any VR developers listening would say the same thing. It makes it hard to bring out a small, simple, notepad-style utility that just works on everything. We've had to focus on the Meta Quest product line recently, and look, it's amazing hardware, absolutely amazing hardware, but we have some real difficulties getting our app to all the hardware. We have to constantly make compromises.
We've previously had a demo that works on everything, and with Meta's latest Quest about to launch, we're just waiting to get our hands on it and make sure that we support it. We've also been priced out of the Apple Vision Pro thing. We haven't had any customers in that space yet, but in theory, our product should just work on Apple Vision Pro. We've been holding off doing a public demo splash until we're confident that it can work on both the most expensive and the cheapest hardware in the market, and that's just such a moving target.
Jason Koo: Sounds like a lot of work. You did mention earlier on, I think you had mentioned an earlier Steam app that kind of predated Grapho. Is that still available? Is that something people can start with?
Michela Ledwidge: Yeah, it is. Yeah, so A Clever Label has been available on Steam for three years, and it's very recognizably Grapho. It literally evolved. I mean, the topic was the dark money patterns that were basically holding up LGBTQI rights in Australia. We had a big referendum on marriage equality, a plebiscite, and the documentary starts with the question: why did it take so long, when the Australian public was in support of this 10 years ago? And then it looks at which organizations and specific people funded all of the blocks.
So it wasn't a very popular project, and it's very easy to get sued in Australia. So what I did was just only use government data, and then I used GraphoXR to make it much more interesting than just looking through a government website at financial disclosures.
It was quite a polarizing topic, and I had a lot of people say, "I'm not so interested in the topic, but I'm really interested: how do I extract your interaction mechanic out of it?" So again, that's the story of Grapho: it actually started as a very non-corporate IT exercise and has now moved into the enterprise IT tool space. We've been able to separate the content and data specifics entirely from the tool, but the origin of it was A Clever Label.
So it was a forensic deep dive investigation, and I'm hoping that APNIC, our first customer, will also release a tool eventually, because they commissioned us to build a tool that lets you look at the state of the Asia-Pacific internet itself. You could literally open up a map of the region, select Fiji, and then the topology of the Fijian internet would spawn in front of you. All of these kind of low-level protocols that help manage the internet were exposed in this 5-million-strong graph, which was used by forensic network analysts to actually maintain the Asia-Pacific internet. So we were very happy about that being our first customer. I'd love to see that pop out.
Jason Koo: That sounds like a very cool visualization, a lot of data too, depending on what level you're looking at.
Michela Ledwidge: And it was non-controversial. I went from doing this controversial art documentary to a topic no one was going to complain about, it was the internet. It was like, "Oh, no, what's your next project? Oh, it's about the internet. Okay, cool. Yeah, all right, I'm interested."
Jason Koo: Cool. Oh, now, so you do have a talk coming up, Michela, at our NODES conference in November.
Michela Ledwidge: Yes.
Jason Koo: Did you want to talk a little bit about that presentation, a little sneak peek?
Michela Ledwidge: Yeah, sure. So basically the aim of the presentation is to do the opposite of this and just show, show, show. I think it's only 15 minutes, but I love using GraphoXR on stage and in public presentations, because people get excited when they can see, "Oh, I can do that." A computer science degree probably helped make this thing in the first place with a small team, but it's not complicated to use. As you know, you just reach out and grab stuff and do it.
And so the aim of the talk is just to show how it has evolved, and it's mainly in terms of polish. I mean, there's just been a huge amount of under-the-hood changes to try and make it work better. And so we're racing to get this new public demo out. I'd love for the NODES presentation to be able to say, "You can now go to this link and download it." That's what I'm aiming to do.
Jason Koo: Nice. If you can say, are there any new features you're excited to be adding to it? I mean, just making it publicly available, that's already, that's a huge win, but since I've used it already, I'm kind of curious like, oh, what's the next feature that's coming out?
Michela Ledwidge: Yeah. So this particular round, all the improvements are under the hood, like a more optimized solver. This week I'm so excited to have a new problem, which is that we actually have to slow down the speed at which the graphs appear, on the Android version at least. I mean, we're trying to do all this on the equivalent of a mobile phone.
So as you can imagine, there's only so much data we can process in the app. We've got a PC version, which can handle huge data sets, but for the mobile VR version, until this week, the speed at which everything spawned out was partly related to how much the application could process. Whereas now it's like, oh, actually it's coming out too fast. It doesn't look as nice as it did, because we've optimized. We don't want to just put stuff out as fast as the computer can throw it, so now we have to wind it back so it still has the aesthetic that we like.
So I'm really excited by that. But that's the big one. We had video support three years ago, and we've kind of had to pull it out because we just didn't have time to focus on it. So that's coming back.
I put in support for all audio formats last week, which I was excited about. That was important to me just because we want to be able to deal with any multimedia links in a sensible way that are in the graph. You probably remember those data pads, so very much inspired by the humble iPad. So it's almost like you've got an iPad size object that appears when you grab a node, and we have kind of widgets that will... a bit like the NeoDash widgets. They're called widgets, aren't they? I think [inaudible 00:32:23].
Jason Koo: [Inaudible 00:32:23]. That's what I've been calling them.
Michela Ledwidge: Yeah, yeah. So basically, we have a similar logic in GraphoXR: depending on whatever the content is, whatever properties are on the node, it will just automatically use an appropriate widget. So for example, if there's an image URL, it will default to just showing that image. If there's a video URL, it'll show that video. If there's subtitles, it'll automatically play the subtitle text as well as playing the voiceover.
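The property-driven widget selection Michela describes can be sketched as a simple dispatch on the node's properties. This is an illustrative sketch only, not GraphoXR's actual code; the property names (`video_url`, `image_url`, and so on) are assumptions for the example.

```python
# Pick a data-pad widget based on which properties a node carries.
# Property names here are assumed for illustration.
def pick_widget(props):
    """Choose the richest widget the node's properties support."""
    if "video_url" in props:
        return "video"   # video player; subtitles would ride along if present
    if "image_url" in props:
        return "image"
    if "audio_url" in props:
        return "audio"
    return "text"        # fall back to plain property display

node = {"name": "Episode 42", "image_url": "https://example.com/cover.png"}
widget = pick_widget(node)
```

The ordering encodes a priority: a node with both a video and an image gets the video player, and a node with no media falls back to text.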
So there's all this additional stuff, but we're now at the point where we want to give the user or the developer the ability to choose. And A Clever Label also had two-sided data pads, so you could look at this information and then, if you wanted to check, well, did they just make this shit up? No, actually, you just turned it around, and we've got all the references to where this came from.
Again, Jason, this is why we haven't released it as a consumer product because in terms of crafting the story for the right audience, we've got all these tools, and what's most important is that it all comes together for the right audience. But within our resources at the moment, we've chosen to focus on let's try and get the most powerful illustrations of the tool out first.
I mean, Facebook wanted us to do the opposite. They said they weren't interested in Neo4j support. They liked the idea that it was almost like a toy tool: you could build your own graphs from scratch. And I just said I'm not really interested in that, because there's so much data out there. The whole point of this is not to be able to build your own graph from scratch. We can do that super quickly in a tool like Bloom, or write five lines of Python. We don't need yet another editing environment for creating the data. I want you to be able to manipulate and work with the data you already have. I need to work on the length of my answers. They're a bit long, aren't they?
Jason Koo: No, they're great though. But I mean, you're answering it in depth and giving real insight into kind of the history of all the development and really defining why it can't be just immediately a consumer downloadable app, which is the first thing I was wondering is why do we have to side load it? Ah, okay, that totally makes sense, right? You're actively doing a lot of stuff, and you're right, when you generalize tools like this, it is a lot of effort. It's more effort than just going in and just kind of updating a few lines of code and then recompiling, so.
Michela Ledwidge: Yeah, it's a philosophical thing, but it's also the system that we're working within. We haven't been actively seeking funding for this. We've been seeking funding for projects using it, but it's a slightly different focus. I'm very proud to run a genuinely independent studio, and there aren't many out there, as you know, and it's a tough time for game developers across the board. But running an indie studio means I have the ability to say, you know what? We came up with a tool in the context of this entertainment production or this corporate project, so let's take a moment and think about whether this tool has broader utility than the product we've just made. We didn't set out to make a Neo4j-compatible spatial browser for graph data. It just evolved out of production needs. And I think production technology has been burnt time and time again by really valuable tools getting acquired and then killed.
For example, on the Lord of the Rings films, the compositing tool used primarily was Shake, which was bought by Apple and then canned a few years later. And we see that time and time again. So I want to give GraphoXR, this VR thing that seems to be finding some audiences, a bit of space to breathe, and ideally we will spin it off as its own company, and then that whole exercise of making a robust consumer edition can start in earnest.
But my view as the product owner and the architect behind it is that this is a tool I've invented for my own purposes that other people seem to like, and I want to continue using it. So it has to remain in a form that's valuable to folks who have real end-user applications, not just the abstract notion of, yeah, we can sell X many units. I've really got to focus on the early customers.
Jennifer Reif: Spoken like a true developer and artist. Yeah.
Michela Ledwidge: Yeah. I mean, it's exciting. I was handcrafting all my graphs in Bloom a few years ago, and now we're just generating everything with a couple of lines of code, and we've got LLM hookups using fine-tuned original models for the first time this year. And I'll pitch it here because I know this is an enterprise audience. This is one of the products I'm working on. Just imagine that in a year's time, while you're listening to this, you go and download the GraphoXR demo because you're interested in the burblings that we did. You're still listening to this podcast, but what if you went to the graph for this particular episode, and as Jennifer speaks, as Jason speaks, as I speak, the actual information that we're describing starts to appear as nodes? So by the end of the podcast, you have got a graph of this episode.
So it's not just bookmarks or show notes. You've actually got all the relationships between what we are talking about, every single nugget that we mentioned, whether it was history or product names, whatever, it just appears.
So that obviously is achievable, and it's something that I think has some value as an evolution of show notes. We've got all the building blocks; we just haven't had time to really focus on it. That's one of my plans for next year: in the same way that we've got a fairly robust expo registration system that companies like Neo4j can use to create conversation starters around their graph data using VR, what if you had a web page where the graph just started building live based on audio input or video input?
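The "graph that builds itself as you listen" idea can be sketched crudely: stream in transcript fragments, extract entities, and emit Cypher that grows the graph incrementally. Everything below is invented for illustration; the fixed entity list, the `Topic` label, and the `MENTIONED_WITH` relationship stand in for what a real pipeline would get from speech-to-text plus an NER model or LLM.

```python
import itertools

# Hypothetical: entities we already know how to spot. A real system would
# use an NER model or an LLM instead of a fixed lookup set.
KNOWN_ENTITIES = {"Neo4j", "GraphoXR", "Bloom", "NODES 2024"}

def entities_in(fragment):
    """Naive extraction: which known entities appear in this fragment?"""
    return {e for e in KNOWN_ENTITIES if e.lower() in fragment.lower()}

def fragment_to_cypher(fragment):
    """Emit Cypher so each fragment grows the episode graph: one MERGE
    per entity, plus a co-mention relationship per entity pair."""
    found = sorted(entities_in(fragment))
    stmts = [f"MERGE (:Topic {{name: '{e}'}})" for e in found]
    for a, b in itertools.combinations(found, 2):
        stmts.append(
            f"MATCH (a:Topic {{name: '{a}'}}), (b:Topic {{name: '{b}'}}) "
            f"MERGE (a)-[:MENTIONED_WITH]->(b)"
        )
    return stmts

# Simulated transcript stream arriving fragment by fragment:
for line in ["We were handcrafting graphs in Bloom",
             "GraphoXR talks to Neo4j"]:
    for stmt in fragment_to_cypher(line):
        print(stmt)
```

Each fragment could be pushed to the database as it arrives, so a listener watching the episode graph would see new nodes and co-mention edges appear while the conversation is still running.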
Jason Koo: That sounds like a super fun way to walk people through a presentation, right? It starts with a single nugget, and then by the end of the presentation or the talk, you've got this mini knowledge graph that you could just hand someone: okay, you can now actually use that if you wanted.
Michela Ledwidge: Yeah, and it's hard to argue its value until you try it. But it comes back to what Jennifer said earlier about people trying to understand the concepts of a graph database from reading text. We've got all these amazing multimedia capacities, and I think it would be a real shame not to use them. Whether it's corporate IT or entertainment, there's a certain level of boldness required to try and come up with new formats and new mediums within all this stuff. But the reason folks like me do it is because it's exciting. It doesn't always work, but when it does, there's nothing more satisfying than having these kinds of conversations.
I should throw in one last use case that I've discovered, which really relates to what Jennifer was saying. Through the making of A Clever Label, I discovered that what I thought would be the target audience for this documentary, people who understood what was going on behind the scenes in politics between LGBTQI organizations, governments, and fundamentalist religious groups, the activists in that space, weren't actually interested, because it's all a bit depressing to them.
And it's like, we know all that. But when friends would be polite and put the VR headset on and look at the data, I observed something really interesting. One particular guy goes, "Oh, yeah, I know that, and I know that. What's the link?" And just watching this person who'd never used VR before go, "What's that?" and reach out, without being told that he could, grab the line, and it shows the dollar value of how much this particular high-profile individual had put into an organization that had basically been blocking this user's human rights. It's like, "Oh, I didn't know that."
And so he's literally holding in his virtual hand a new piece of information that he'd only discovered because that relationship was highlighted to him, not because we'd made a fancy video or a PowerPoint that stuck it in the center of the image. Just by moving stuff around in a kind of haphazard way, he'd made a discovery.
So it's almost like the textbook definition of new insights. And I think that's my goal with all of this, to get the right relationships, the right nodes in front of people for them to then say, this is valuable. And so we have to do a lot of boring engineering to make it look as polished as it is. And the argument in a way is not to just keep throwing new features at it because then it just becomes too complicated and overwhelming, and we're back where we started.
Jennifer Reif: One of the most powerful things about graph in general is just pouring some random segment of data into it and then just kind of exploring and playing around and seeing what you can find. And of course, we typically do that through Cypher or some other means, dragging and dropping on a 2D screen, but doing that in 3D I think is at another level and just yet another aspect that hasn't been fully developed or explored yet.
I think on that note, we could probably go all evening or for you, all day, but we'd probably better head into a little bit of Tools of the Month and wrap up on this super fascinating topic. And if you're interested in hearing more, Michela will be at NODES 2024 on November 7th, so be sure you register for that and check out her talk there if you want to learn more about the tool and everything that they're building at Grapho. Who wants to go first on Tools of the Month?
Jason Koo: I think we should start with our guest.
Michela Ledwidge: I've been a bit cheeky today and kind of gone down some rabbit holes because I thought I really need to demonstrate the value of humans on a podcast after an experience I had earlier this week. Google launched this new learning assistant research tool called NotebookLM, and without knowing anything about it, I went to this webpage and I pasted the URL.
So we haven't got a publicly downloadable demo, but we have published the user guide at docs.grapho.app. And I just basically... no one reads user guides or manuals. I just thought, I'm going to throw the URL of our user guide, which I've slaved over maintaining for several years, into this tool. And there was a button that said create audio conversation. Within a few seconds, I had a downloadable nine-minute, basically a podcast interview between a GenAI-created American-sounding male and an American-sounding female talking about GraphoXR for literally nine minutes.
And it's not perfect. It does sound like an American infomercial, but I was quite blown away. A bit like what we were doing today talking about use cases, they were bringing up more bread-and-butter use cases, explaining how your product sales are going, all this sort of stuff, which I kind of find a bit boring. But they were bringing out really valid use cases for the product we've made that we've never ever mentioned to anyone, which we probably should mention, because there's probably money left on the table by not telling people how they can make money using their data in this way.
And it really showcased that the human connection in conversation is valuable. But yes, if you're sounding like a template, there are going to be systems to replace you. So fortunately, you two are not going to be out of a job, because this is way more interesting and variable than what the systems can generate. But yeah, it's an amazing tool to play with.
Jason Koo: Super cool. Yeah, I want to start playing with that now. So in a similar vein, my tool of the month is the HeyGen app, or HeyGen service, which allows you to create digital avatars of yourself or use pre-baked avatars for whatever you like: creating social videos, training videos, basically helping you be more productive in the video space. And I was quite impressed.
So you just submit a two-minute video of yourself talking, saying a script at the camera and moving your hands and stuff, and it makes a pretty good facsimile of you. I've tried it and also done a couple of the language options, so I've heard myself speak a very poor version of Austrian German, which got a lot of laughs in the house. But it's interesting to see how far that technology has come.
But you're right, it's gotten slightly past the uncanny valley, but it's not as dynamic as real people in a real conversation, right? It's like that Google NotebookLM thing. If you listen and watch long enough, it's kind of flat, right? It stays within a very narrow range of emotion and interactivity. But if you're interested in trying out one of these tools, I'd say give HeyGen a try. It's been quite impressive so far.
Michela Ledwidge: Wow. Yeah, definitely going to have a look at that one.
Jennifer Reif: And to deviate off that, I write a lot of Java code, and I've always defaulted to using IntelliJ IDEA. JetBrains has a few other IDEs out now for various languages, but IntelliJ IDEA handles Java and Kotlin. It's a really fantastic tool, and has been for a long time. They actually recently had an update to it, and I updated a few days before a presentation I did this week. It was kind of funny, because it's so much more intuitive with the AI coding tools now. There's a lot that it suggests for you, and it's like, "Oh, yeah, that's exactly what I want to write." So you'll accept that, or you'll select something and it doesn't quite get it right, so you do have to go back and verify. But it's really nice, when you're typing tons of characters for a big long block of code, to have it go, "Oh, did you mean this?" "Oh, yeah, that's it." So you select that, and it populates the whole line or the whole block for you, and it's so nice. I don't have to type all that out manually.
So yeah, definitely beware. Don't just go on autopilot and select, select, select, because when you try to run it, some stuff will probably just fall apart and break, and you'll go back and look at it: "I didn't write that." Well, no, you didn't, but you selected it; the AI thought it was doing a good thing.
So there are some really cool capabilities, but you do have to really focus and pay attention to what's actually being dropped into the code. If you're a Java or Kotlin developer, check out IntelliJ IDEA. If you're not, check and see if they have an IDE for your specific language. Really fantastic products with lots of customizable features.
Michela Ledwidge: Great tips. Thank you.
Jason Koo: Cool. What events will everybody be at this month?
Michela Ledwidge: We've got a really great milestone coming up for our little team. I'll be at GraphSummit London on October 16, and for the first time ever, we have a second event running simultaneously: Sarah's going to be running a GraphoXR experience at South by Southwest Sydney here in my hometown. So on October 16, we've got Grapho stuff in London and Sydney at the same time, and they're both free, I believe.
Jennifer Reif: Nice. That'll be a really eventful middle of the month for you guys.
Michela Ledwidge: Yes, yes. No, it's really good. It's going to test our process as well.
Jennifer Reif: Yeah. I will be at dev2next in Denver, Colorado, next week, mostly focused on the Java ecosystem. And then I'll be speaking at a meetup in New York City. I believe that's the 22nd; I'd have to look, but I'll have the meetup page linked, and the event will be posted there. NYJavaSIG is the group, so I'll be there in the middle of the month for that. Jason?
Jason Koo: For me, I've got San Diego Startup Week here locally, happening the third week of October. That's a week-long startup event with a lot of tech talks, a lot of folks in the technology space in San Diego, and a few folks coming in from elsewhere. So that's going to be my big event for October.
Jennifer Reif: It'll be a busy month for all of us, I think. Well, to kind of wrap up, thank you so much, Michela, for coming on and talking about all of the projects and the use cases and some ways that people can start interacting with some of the tools and products that you use. And then of course, we will definitely be looking forward to seeing you at NODES.
Michela Ledwidge: Thank you, Jennifer. I really enjoyed the conversation. Thanks, Jason.
Jason Koo: Thank you.
Jennifer Reif: All right, see you everyone next month.
Michela Ledwidge: Bye.
Jason Koo: Bye.