Episode 10

In this episode of Re:Imagine Ownership, the hosts sit down with Stephen Gay, Co-Founder and CEO, and John Finger, COO, of ModeMaison Labs, to explore how their innovative digital platform is transforming luxury retail. They discuss how ModeMaison Labs blends cutting-edge technology and immersive storytelling to redefine digital ownership, the future of personalized and sustainable luxury goods, and the role of AI in shaping the industry. The conversation also covers the challenges of evolving traditional retail models and the potential impact of generative AI on commerce.
Re:Imagine Ownership Podcast
Author
Metaversal
Published on
February 26, 2025

Key Topics Discussed:

  • Stephen’s journey from Ralph Lauren to ModeMaison Labs
  • Challenges in traditional retail storytelling
  • The evolution and future of digital retail
  • Understanding Digital Core and its impact
  • The role of TMAC in digital product creation
  • Ownership and the future of digital twins
  • Royalty models in the digital world
  • Building a talented team at ModeMaison Labs
  • The importance of proprietary data
  • Exploring world models in AI
  • The future of AI in commerce and retail
  • Personal reflections on ownership

Notable Quotes:

"Luxury is about storytelling, and digital ownership allows us to extend that story in new ways." — Stephen Gay

"AI is not just a tool; it’s a collaborator in redefining commerce." — John Finger

Key Takeaways:

  1. Luxury retail is evolving—AI and digital platforms are enabling new forms of personalized and sustainable luxury.
  2. Storytelling is at the heart of digital ownership, and companies like ModeMaison Labs are pioneering immersive experiences.
  3. Generative AI will reshape commerce, offering new ways to design, sell, and own products.
  4. Proprietary data is the future—companies investing in unique digital assets will have a competitive edge.

Guest Bio:

Stephen Gay is the Co-Founder and CEO of ModeMaison Labs, with a background in fashion and digital innovation. He previously worked at Ralph Lauren, where he honed his expertise in luxury brand storytelling.

John Finger is the COO of ModeMaison Labs, specializing in the intersection of technology, commerce, and digital ownership. His work focuses on creating seamless, AI-driven experiences for the future of retail.

Connect with ModeMaison Labs:

Website: modemaisonlabs.com
LinkedIn: ModeMaison Labs

Connect With Us:

Telegram: https://t.me/hellometaversal
LinkedIn: Metaversal
Website: https://www.metaversal.gg

Transcript

RIO E5: Stephen Gay and John Finger

[00:00:00] Welcome back to the podcast. Super excited today to have Stephen Gay and John Finger joining us to talk about ModeMaison Labs. Thanks. Stephen is the co-founder and CEO. John is the COO. An integral pair. ModeMaison is a groundbreaking digital platform that's transforming the world of luxury retail. The company is at the forefront of blending cutting-edge tech with immersive storytelling, and everyone that's been following Metaversal knows that storytelling is at our core. But ModeMaison is creating a fully digitized shopping experience that's like nothing else, certainly nothing else we have seen, and I don't think anything else that you'll have seen.

It's the first platform of its kind, building a foundation on [00:01:00] digital assets, harnessing, you guessed it, state-of-the-art AI to craft what they call digital DNA. It's a revolutionary approach that captures the essence of luxury products down to their smallest detail. And by pushing the boundaries of personalization and sustainability, ModeMaison is redefining how luxury home goods are discovered, how they are purchased, and how they are experienced.

Guys, welcome to the show. Thanks for having us. Thanks for having us. I want to jump in, Stephen. I know that you came from the fashion industry. You were the youngest concept designer at Ralph Lauren, not too long ago. I want you to tell us about that experience in retail and how it became the basis for building the digitized generative retail model of the future.

[00:02:00] Yeah, so it all kind of started at Ralph. I was pulled out of school a little early to join the concept team there. And the whole goal with the very small team that we had was to build these dream worlds that the products would then be designed into.

So while there, it was constantly about, you know, building these beautiful worlds, so to speak. We'd be doing it physically in concept rooms, constantly pulling inspiration, and just building toward what we'd call movies or stories or worlds. What's an example of that?

Is it the winter wonderland theme for Ralph in Colorado? Is it the Texas, you know, cowboy theme? So I was always partial to the Western cowboy theme because I'm from Texas, but we won't hold that against me. Or a Georgia O'Keeffe kind of beautiful scene in the desert, and kind of a movie and a story that revolves around that.

So, you know, creating the textures, [00:03:00] the color palettes, the look, the vibe, the feel, who that person was. And, you know, we could only go so far. So we were constantly building those worlds. And I think Ralph does it better than anybody in terms of building out these stories.

And so we'd have these huge concept rooms where, in some of them, there would literally be planes, quite literal planes they would reconstruct. So we'd try to recreate these incredible worlds, I keep saying that, that you get immersed in. But again, you can only get so far. So taking that inspiration and insight, and knowing what was out there in terms of what you could do digitally, I thought, all right, there's got to be a new way where we could actually create these beautiful dream worlds to the nth degree, just push it that much further in the virtual world, because the sky's the limit when it comes to what you can do digitally. The thing that I would say is that if we wanted to put Jacques Cousteau on a chair on Mars, we could do that.

I don't know why you would, but we could do that. You might! [00:04:00] I want to just level set, because, you know, I find this part super interesting, which is that the historical way these brands do storytelling feels a bit like mini theater, a bit like mini cinema. Ostensibly, you're trying to convince a shopper like John to envision himself in that scene so that he feels inspired, right?

That's aspirational. He wants to be the cowboy. John has always wanted to be the cowboy. I should say, since we didn't during the introduction: John and I have known each other for over 20 years, because we went to college together and we were fraternity brothers. In fact, he was president of the fraternity and I was the bursar, which means I had to pay all of John's bills, and they were large.

There was no waste or abuse, but they were large bills, and he was [00:05:00] running a great operation. So that's how we know each other. So I put him on the spot, saying, what inspires John to go and make that purchase? So Stephen, maybe you can give us a little more insight into the traditional way things are done.

It's a lot of time, a lot of energy, a lot of personnel, and a lot of cost. Yeah. So there are several things there. First of all, you're assuming that there's only one movie going on in any given season. There are typically four or five or six collections, so to speak. And that's within men's, and that's within one brand.

So then you have women's, you have children's; the costs are almost untenable. When you think about it, we're talking, you know, tens of millions of dollars. And also, in the fashion world, they're typically working two years prior to the actual season going out.

So it's just so inefficient. People have been trying to work on this problem for a long time. Not to mention trying to predict where the puck is going two years in advance. [00:06:00] You're already concepting what John will want to wear in 2027. Exactly. And on the experience side, it's also so limiting in terms of the physical limitations of how far you can go within the realm of storytelling, and unlocking that creativity is just so limiting.

So there are just so many constraints in the physical world. And it's almost a nonstarter when we're talking about the amount of content that's going out today and these dream worlds that the brands need to create. You're really up against a lot. So the thesis is: if you can basically do all of that in a computer, or via digital experiences, then it's almost like the holy grail. Because not only do you eliminate costs, which is really not the whole goal for us, it's an incredible byproduct.

The whole goal is just to enable more creativity that is not limited by your physical limitations, right? And to better understand that: if you were on [00:07:00] location, taking photographs or filming something, you have a certain set amount of time. There are models, there are actors, there's a rented space, there are food and catering trucks, right?

It can't go on forever. So your ability to iterate is likely limited in the physical, traditional way. If I understand what you're saying correctly, the ability to do this digitally means you're only limited by your imagination. John can sit there putting in the inputs, clicking the button as many times as he desires, until he gets what he considers the perfect scene to position that product and inspire that shopper. 100 percent. And we'll probably get into this in a second, but even the act of clicking and creating that scene iteratively is something we're working on doing away with entirely. So there's a barrier there too, because if we just move the physical constraint barriers of the physical world, you could maybe make that same argument.

Not maybe, you could absolutely make that [00:08:00] same argument when it comes to the digital barriers, in terms of the education needed, all the tooling, the tools, the workflows. It really is limiting in terms of how much content, and all these dream worlds, you can create, you know, in a digital atmosphere. It's maybe more complex, honestly, in the digital realm. So we've had to rethink basically everything from the ground up. It's a big undertaking. And I know you mentioned cost, right? Tens of millions that are typically spent for large brands.

I have heard anecdotally that just one photo that I might flip through in a magazine typically costs $10,000. Way, way, way more. Hold on. Yeah. So we're talking, value shopper, I'm a value shopper. So maybe it's the places that I shop. Yeah, I don't know where you're shopping. No, no, we're talking millions of dollars per photo shoot, and you're getting, you know, single assets.

Like, we're talking a couple of assets that you can actually use. So it just doesn't [00:09:00] make any sense. It's even crazy for us to talk about a single frame or single image or asset, because what you see is what you get. That's it. When you create things digitally in a virtual world, it's infinite.

So let's pick up on that and talk about ModeMaison's Digital Core infrastructure, which is really the underpinning of the digitized future that you guys have been working on and that you're describing here with us. Can you share with all of our listeners, what does Digital Core mean? Because that's a made-up word; it's a term of art. And how is Digital Core allowing brands to take ownership of their assets in a way that really hasn't been possible before? Because you're already touching on it. Yeah, so just to back up real quick before we get to Digital Core: essentially, the way we see the future of retail, or commerce, or really just everything, is that we think it's going to be an interconnected web of, you know, digitized systems, [00:10:00] products, processes, experiences, and capabilities wholesale.

It's going to make up our lives going forward. And so in order to enable a fully interconnected system or ecosystem, where you can have digital assets and actually create those digital assets in the real world, you have to take a very first-principles approach. And this is where the genesis of this idea of Digital Core came from.

So right now, everybody's chasing this problem of digitization at several different layers, or different altitudes. But what nobody's really done is come in and define the most basic infrastructure to be able to scale everything else up. And so right now you have all these different tools and workflows and technologies that are basically entirely subjectively driven.

And they're not based in the real world. They're not based in real-world physics. And if you want to create products in a computer, and you want to [00:11:00] scale that up across different ecosystems and then ultimately make them in the real world, well, you need real physical, real-world data to be able to leverage and to drive all your AI going forward.

I just want to jump in real quick, because at the top, Dan, you talked about luxury, and he talked about home goods a bit. But just think about, as we speak, materials wholesale, whether it's a shirt or a sofa or a metal. Quite literally, this is relevant for any material at all.

So when we talk about digital DNA, Digital Core, it is an infrastructure for exactly that, whatever that is. That is a man purse. Yeah. So Digital Core basically is the representation, or the simulation, of physical materials in the virtual world. So if you have a fabric in the real world, it's basically taking all the data from physical sensors and then [00:12:00] bringing that into the computer. The analogy I'll make here is kind of similar to Tesla.

So, before Tesla, or before autonomous vehicles were widespread, everybody tried to tackle the problem of autonomy via a hard-coded approach, via algorithms alone. But as we know, there's an infinite amount of scenarios that a car could come across. An infinite amount of grandmas can be walking across the street, or deer, or whatever. There's just infinite variability.

So it's really an impossible problem to solve if you don't leverage real-world data. What Tesla did essentially, and why they made this huge breakthrough, was they embedded sensors in cars, were able to leverage the data they captured from those sensors, and then created synthetic data sets and these large foundation models to drive all their simulation and capabilities going forward.

That same thing is needed for everything outside of autonomous vehicles. It doesn't stop at autonomous vehicles; it's needed even more across everything else. There are infinite possibilities of materials in the real world, to be specific. And [00:13:00] so right now, the status quo is: okay, let's create materials in a computer, but via code.
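To make the sensor-first idea concrete, here is a minimal sketch of what a "digital DNA" record for one scanned material might look like. Everything here, the field names, the wool example, the plausibility check, is a hypothetical illustration, not ModeMaison's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class MaterialRecord:
    """Hypothetical record for one scanned physical material."""
    material_id: str
    source: str                      # e.g. the mill or factory that scanned it
    albedo_map: str                  # path to the captured color texture
    normal_map: str                  # path to the captured surface-detail map
    roughness: float                 # 0.0 (mirror-like) .. 1.0 (fully diffuse)
    metallic: float                  # 0.0 (dielectric) .. 1.0 (pure metal)
    physical_props: dict = field(default_factory=dict)  # measured, not authored

def is_physically_plausible(m: MaterialRecord) -> bool:
    """Sensor-derived values should sit in physically valid ranges."""
    return 0.0 <= m.roughness <= 1.0 and 0.0 <= m.metallic <= 1.0

wool = MaterialRecord(
    material_id="wool-herringbone-001",
    source="example-mill",
    albedo_map="scans/wool/albedo.png",
    normal_map="scans/wool/normal.png",
    roughness=0.85,
    metallic=0.0,
    physical_props={"weight_gsm": 310, "weave": "herringbone"},
)
print(is_physically_plausible(wool))  # True
```

The contrast with the "via code" status quo is that every numeric field above would come from a measurement, not from an artist's judgment.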

Okay, well, there's an infinite amount of materials too. So in order to level set, you need a very first-principles approach. And that's where this idea of Digital Core came out: it takes a first-principles, real-world, embedded-sensor approach to driving the future of digitization and simulation. All right. So you guys have a hardware component, the Total Material Appearance Capture system, TMAC. But you're not alone; there are other groups out there that have high-fidelity scanning devices. So I want you to explain a little of the differentiation, because I know your CTO in Europe, Yakov, has spent a lot of blood and treasure on developing the team, and you have deployed it.

You have deployed it already, successfully, for a number of [00:14:00] years, with major companies that are indeed utilizing it the way you're describing, getting ready for this digitized future. Give us a little color about that component. Yeah, John, do you want to touch on that? So the one differentiator I'd read off the bat is size.

It allows us to create higher-fidelity maps. And basically, think about what you digitize, and it's very easy. I'm not a technical person, but simply: you put a swatch of material on the device, you press a button, and then you have seamlessly tileable maps that comprise, imagine, the DNA of a certain material.

So one differentiator is size. You know, typically you're capped at an A4 paper size; ours is 1.3 by 1.3 meters. So it's much more flexible. And I think we would stand by the fact that our outputs are that much better. So we're closer to the reality than [00:15:00] having somebody push a button and then a person on a team having to spend hours, if not days, entirely subjectively, to create that material.

I may be doing it one way; the person next to me may see it differently and do it a different way. We're all trying to get to the same output, but really what you end up with is multiple variations of the same thing. Whereas we're taking subjectivity entirely out of the picture: you press the button, and it's an objective view of it. And that's how we're thinking about it.
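The "seamlessly tileable maps" John mentions are textures whose opposite edges line up, so a small capture can repeat across a larger surface without visible seams. Here is a toy illustration of that property, using plain lists of ints as stand-in pixel data; this simplified equality check is my own illustration, and real capture pipelines operate on full image maps:

```python
def tiles_seamlessly(tex):
    """Toy check: a texture tiles (in this simplified sense) when each row's
    first and last pixels match, and likewise for each column, so copies
    placed side by side repeat without a visible seam."""
    rows, cols = len(tex), len(tex[0])
    horizontal_ok = all(tex[r][0] == tex[r][cols - 1] for r in range(rows))
    vertical_ok = all(tex[0][c] == tex[rows - 1][c] for c in range(cols))
    return horizontal_ok and vertical_ok

good = [
    [1, 2, 1],
    [3, 4, 3],
    [1, 2, 1],
]
bad = [
    [1, 2, 9],  # right edge breaks the repeat
    [3, 4, 3],
    [1, 2, 1],
]
print(tiles_seamlessly(good), tiles_seamlessly(bad))  # True False
```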

It's a realistic version of this shirt, as an example, that captures all the curvature, the way the fabric moves, right? That's part of this unlock. Am I right, Stephen? So, yeah, we want to be able to design products inside a computer, where we can limit all sampling and all the physical costs associated with the designing of any product. Because at Ralph, you know, we'd spend millions of dollars every season just in sampling costs alone.

Also, [00:16:00] let's talk about sustainability implications. So if you can basically eliminate all the overhead in terms of sampling costs, and the amount of burn of fabrics that never see the light of day in production, we're talking a huge elimination of those fabrics that basically just end up in the landfill.

And enabling designers to design products inside a computer first. So that's really important. And then also, ultimately, you need a product that you can actually sell in a computer. And that requires an incredible level of fidelity, because at the end of the day, if I'm a designer, which I, you know, used to be, and it's not looking exactly one-to-one with how it's going to look in the real world whenever it's produced,

Well, then it's a nonstarter. So it's a really hard problem to solve. But I think the future there, and what that will enable, and what that means, is pretty huge. So what we're trying to do is basically enable product creation inside a computer, and then enable experiences, and people, to be able to sell a product, or to see a product, before anything has ever been made.

And so [00:17:00] that transforms retail and flips it on its head, where you unlock a fully digitized, what people call PLM, product lifecycle management, system, or supply chain. It eliminates all SKUs. I mean, it eliminates, not SKUs, it eliminates all stock. There are just a whole lot of implications.

It's quite, I mean, you know, I'm newer to this space than Stephen is, obviously, but it's quite archaic, and it's really surprising given all the technology that's available and all the breakthroughs we're seeing. Is it surprising? They've always done it one way and people keep buying.

So they continue to do it. Environment be damned. Let's just produce the material and we'll burn it. Right. It's terrible. Yeah. But again, it is so time-consuming, and the amount of people, and again, we're talking two-year production runs, basically, in terms of design phases within fashion.

Why is that? The question is, why is that? It's because of all the stuff we're talking about right now, in terms of the actual designing [00:18:00] and sampling and, you know, back and forth. We can eliminate all of that.

So I want to move on. You know, the first time I met Stephen, it was through a mutual acquaintance, someone who is very experienced in the retail space. And I recall being wowed and inspired by the breadth of Stephen's vision. He's a real, you know, change maker and dreamer. John, the reason I think this pairing is particularly interesting is that John is a realist.

So, John, when you think about, [00:19:00] you know, the uptake, what has been your observation? Companies are already beginning to engage with ModeMaison Labs. We know it's still early days, but you're in front of them. What do you see happening? I'll put that question to you. And then, Stephen, when you talked about material, and the importance of where the puck is headed: I recall Jensen at Nvidia talking about the importance of materials and proprietary data in one of his big product confabs. So maybe you could just remind us why he has said that that's going to be so vital. Do you want to do it in reverse order?

Yeah. So that's one of the things that Jensen always talks about: materials. We've been talking about it since day one, day one being several years ago. We think materials unlock everything ultimately downstream. The world is comprised of materials; everything that we interact with is all materials.

And so we think that that's why Digital Core is so important. If all products are made out of materials, and even our world and everything is based in physics, then whoever has that data, and whoever can recreate materials that are physics-based [00:20:00] and actually grounded in real-world physics inside a computer, unlocks what we're trying to go after, but so, so much more, because basically everything is material. So that's really what Jensen's talking about. And I think the reason that we get excited about, you know, being in discussion with some of their teams, and I think vice versa, is because we see things the very same way: everything starts with materials.

So I'll just jump in. You know, we are in front of multi-billion-dollar corporations, and we're in front of small studios with a couple of people. They all have a similar challenge, right? They have this legacy infrastructure that they've either inherited or built, and it's become cumbersome and bloated over time.

And again, I think they realize that digital product creation is the way of the future. It's like, how do you just stop what you're doing and then adopt it? So the TMAC is that mechanism. Again, you [00:21:00] still do need some infrastructure to take those materials and to make a shirt or a sofa or whatever it is.

And Stephen will talk, I think, maybe a little bit later about a future where, right out of the box, you get the product immediately. But you know, especially coming out of COVID for me, you're at home and you're shopping, you're not going anywhere. And then when I met Stephen, he was painting this picture of this hyper-personalized, one-to-one consumer experience, where everything is designed for a customer in real time, and that really resonated. And I think that's what other big brands are seeing now: how do we shift our entire infrastructure and systems to create products, to be able to deliver on that future promise?

And it does start with materials, and it does start with this TMAC. And the TMAC, you know, has been created across academia and some research institutions, with really smart people on the engineering side. And really, that is the first entry point into this future of digital product [00:22:00] creation.

And again, it could be for a two-person shop that's doing one specific type of product, or a multi-billion-dollar shop that has, you know, a suite of products that they're offering their clients. I know you've had some unusual meetings with government agencies as well, who are thinking about some of the same principles: a single source of truth, material DNA.

And they can model different outcomes based on that. I find that fascinating. I want to ask Stephen about this idea of making products digitally. How are you thinking about verifying ownership? Not just of the digital manifestations, but also of the digital twins, right? There's a digital twin of my shirt.

And can you talk about how that data ends up being the digital fingerprint for those twins? [00:23:00] Yeah, so this all goes hand in hand. So again, when you start to talk about what this future landscape is probably going to look like, and it will definitely look like this, the model kind of flips. The ownership model flips a bit.

Right. It really starts at the source, with the manufacturer. So if a manufacturer, like a mill, for instance, is digitizing all the materials, and then basically anybody can leverage that asset to create really any product with that material, well then, how does the manufacturer benefit? Ultimately, they will manufacture the textile, for instance, in this case. But how does that ownership model look, I guess?

But how does that ownership model look? I guess. I think that, you know, whereas we have brands today and again, I'm always say I'm the biggest brand guy. I love brands, but it's really hard to think that the model for brands. It's going to remain the same down the road because it basically anybody can create any product at any time in a [00:24:00] computer.

And it's entirely democratized in that sense. Then it's just hard to think that the ownership model remains. So basically you'll have an asset that, I think, starts with and is created by, in some cases, the mill. So they scan a material, and they possibly tokenize that asset.

And they're able to follow it along the entire chain, and anybody who wants to use that asset, that material, on any one of their products, well, then they can do so. And then, basically, if the brand, or that person, now creates that product in the real world, then some sort of royalty exchange, some sort of royalty model, is happening.

So everybody's kind of benefiting. Or let's even take a brand, for instance. So you have the Hermès Birkin bag. They own that IP. There was litigation over that in the digital context. Someone tried to sell [00:25:00] work and it wasn't them.

Exactly. So we're seeing that that probably is going to be the model going forward, where certain people have IP around different types of products. They own that digital idea, that digital asset, and then anybody, maybe not Hermès, but anybody, is able to use that asset to some degree. But then the brand, or whoever it is, should be getting some sort of kickbacks or royalties, because they were the ones who introduced it to the world. So I think that right now, that ownership model is pretty archaic; it's very stale. And I think that in the future, when you have all these digital assets flying around, the ownership model has to change.
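The royalty flow Stephen sketches, where the mill that scanned and tokenized a material and the brand that owns the design IP each take a cut when a derived product sells, can be expressed as a simple split along the asset's provenance chain. The parties and rates below are entirely hypothetical, purely to illustrate the mechanics:

```python
# Hypothetical royalty split along a digital asset's provenance chain.
# Parties and percentages are illustrative, not an actual ModeMaison model.
ROYALTY_RATES = {
    "mill": 0.03,   # scanned and tokenized the source material
    "brand": 0.05,  # owns the design IP built on that material
}

def settle_sale(sale_price: float, chain: list) -> dict:
    """Return each party's payout for one sale of a derived product."""
    payouts = {party: round(sale_price * ROYALTY_RATES[party], 2) for party in chain}
    payouts["seller"] = round(sale_price - sum(payouts.values()), 2)
    return payouts

print(settle_sale(200.0, ["mill", "brand"]))
# {'mill': 6.0, 'brand': 10.0, 'seller': 184.0}
```

In practice, the rates would presumably travel with the tokenized asset itself, which is what would let the split follow the material "along the entire chain."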

And, we'll probably get into it in a second, but this idea of what digitization allows for, and where we're really focused for the future, transforms things even more in terms of the ownership [00:26:00] component. I do want to talk about ownership in just a minute, but before we do, I want to talk about talent.

John mentioned there are a lot of very smart people at ModeMaison Labs working on this. So how did you identify, how did you recruit, how did you put together this core team of very smart people? Where are they? I mentioned that the CTO, Yakov, is in Europe. You're in Texas. Tell us a little bit about the crew that's on the field.

Yeah, so that's probably the thing I take the most pride in: being able to get people that otherwise would be very difficult to get. And the skinny of it is that I'll write these love letters, basically, to people that we have no business writing love letters to.

You write love letters. I'll write love letters, yeah. It's pure persistence, [00:27:00] persistence, where they're probably telling the FBI, let's put this kid on a watch list, basically, because he's sending me way too many, and he knows way too much about me and everything about me. So I'm probably on a bunch of blacklists.

Actually, we know we are at least on one blacklist. But anyway, that said, you know, we're super passionate about what we're doing. We're very bullish about this future. And so we will go to the ends of the earth to get the best and the brightest, across disciplines, onto the same team.

So you will go and study the best academics. You will go and study the best researchers and scientists, and you will pursue them. I won't say you'll stalk them, but you'll certainly get a lot of info. Yeah, right. And your love letters are the entrée, and it has worked on a number of occasions. People have actually responded and said yes.

I think, you know, I was not a recipient of one of these [00:28:00] love letters, so maybe I need a mulligan there. But one thing I will say is that, you know, talent begets talent. We all know that. You start to get these really interesting people, who are looking at huge tech companies, who can imagine some sort of trajectory.

But what we have uniquely, obviously, is this talent that we're talking about, plus we have the data. And that was always what was striking to me: we have what other folks can't buy, because there is no use case for them to get this data. And that is also crucial, because in academia, you know, Stephen and I have talked about this a lot.

And us too, Dan. A lot of the big breakthroughs are not happening in academia anymore. There's a brain drain. What's happened historically is that the deepest of tech has been implemented, or at least started, within the world of academia. Yeah. But now, because basically everything is entirely predicated on data, or in a lot of cases [00:29:00] almost entirely predicated on data, especially with these AI models, you need some sort of mechanism or system in place to be able to collect or generate that data.

And there's just no such mechanism within academia. There's no economic incentive, basically, to be able to create that data. So it's coming from industry now. And so you're seeing this huge brain drain, and you have been for the past 10 years, of the top scientists from across the world of academia coming into private institutions.

And so we're able to leverage that uniquely, like John was saying. The other thing is the data sets. Like you're saying, money can't buy them. So it's not a money thing; you also still have to have the mechanism. And because we're uniquely positioned with some of the biggest manufacturers, putting these TMACs on factory floors, collecting the data uniquely,

We have the data sets to unlock this new feature, what we call Mode 1.0: this foundation model, the spatial intelligence model. And so if you've spent your career getting your master's and your doctorate, once you come across these data sets, [00:30:00] it is very enticing to come over to the private sector, where we have been building and assembling teams that can actually leverage those data sets today.

So in his love letter there's a little chimichurri sauce he's putting in there, saying: we happen to already have uniquely qualified team members, but we'd love for you to join, and we have proprietary data. I hear you. I want to pick up on the AI component of this a little bit more.

You mentioned Mode 1.0, which makes me think of ChatGPT at 1.0. No one was talking about ChatGPT when it was in 1.0 mode; it was at 3.5 that everybody started talking about it. Okay, so we have these LLMs, right? Large language models that are tackling text. We have diffusion models that incorporate sight and sound.

ModeMaison is [00:31:00] building world models. That's what you started telling us about today. Share a little bit more about how those world models will fit into this quilt of different AI models that are already being constructed, that people are already using. What do world models actually do, for those who have never heard of them?

And why are they so important? And what is the special chimichurri sauce that you guys have when it comes to building those world models? That's such a loaded question. I could probably talk about this for five years, but I'll try to be concise. So if we take a step back again, like you're saying, we have LLMs; they've been around for a while.

But that's really based on 2D data: text data, language data. And it's entirely subjective data. So we're basically at the point, and I think there was a New York Times article [00:32:00] not too long ago about this, where we've basically scraped all the data in the world.

There's only so far we can go. So that is basically kind of true, but, you know, you have different model architectures that will keep us moving forward. Still, there's only so much you can do with that. I think in a couple of years we're going to look back and say, okay, that was the little kid brother.

That was, you know, not even the first inning of what we can do with AI. Because what we're talking about here is actually incorporating the physics of our physical world, and taking it way, way further. So it's not just subjective language models and understanding from those, but actually building multimodal foundation models that are grounded in our three-dimensional space.

And that are grounded in real-world physics. Why is that important? Exactly: our spatial constraints, our spatial data. Why it's important for us, at least (and I know it's a long way of getting here, but it's a critical one), is that we can basically build this large [00:33:00] world model, an LWM. Everybody has a different word for it.

Some people call it a spatial model; some people call it a large world model. We actually call it a unified world model, a UWM. And I think UWM is more on the nose for what we're doing, compared to what other people are doing, because they're really just trying to incorporate the spatial dimension as well.

And by space, I mean three-dimensional space, geometry. You have an image, which is 2D, flat, and then we all live in the 3D world. Exactly. But we also live in a physics-based, physically based world, so there's another component to that. And so we are basing everything on physical data and also spatial data.

It's important because, if we're talking about a system or a mechanism, this model, that can effectively create things that ultimately need to be created in the real world (because, again, we live in the real world; we're very real about that), then you need a model that understands the real world to some [00:34:00] degree.

Right. So it's really important. Let me just back up real quick. One highlighting factor, one thing that illuminates what I'm talking about: let's think about Midjourney, for instance. Midjourney is a diffusion model. It basically, for lack of a better term, mashes 2D pictures together.

That's why, in some cases, it used to be that it could never generate just a regular hand; it would be like seven fingers. And if you step back and ask yourself, well, why was that? Was there a bunch of training data with a bunch of seven-fingered dudes as inputs?

Probably not. It was because of the assumption that it understands a digit, a finger, from a hand. It doesn't. It has no intelligence; it doesn't understand anything. It doesn't have any 3D understanding. So you might end up with a Midjourney image of someone sitting on top of a chair, but the leg is going the wrong direction. It doesn't have that spatial awareness that I hear you describing.
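The point about missing 3D understanding can be made concrete with a toy pinhole projection. Once a scene is flattened to 2D, depth is discarded, so very different 3D scenes can produce the exact same image, and a model trained only on images never sees that ambiguity. A minimal sketch (the scene coordinates are invented purely for illustration):

```python
import numpy as np

# Toy pinhole camera with focal length 1: a 3D point (x, y, z)
# projects to the 2D image point (x/z, y/z). Depth z is lost.
def project(points_3d):
    pts = np.asarray(points_3d, dtype=float)
    return pts[:, :2] / pts[:, 2:3]

near = [[1.0, 1.0, 2.0]]  # small object close to the camera
far  = [[2.0, 2.0, 4.0]]  # object twice as large, twice as far away
print(np.allclose(project(near), project(far)))  # True: identical image
```

Two physically distinct scenes, one image: a purely 2D training signal cannot distinguish them, which is one way to see why a spatially grounded model needs 3D and physics data rather than pictures alone.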

Exactly. And the Sora model, if it ever comes out (who knows at this point; it's been, you know, probably a year now), OpenAI's video model: they cherry-picked, not to go too deep, but they probably cherry-picked some of the better videos. And even some of those videos have, like, dogs forming out of other dogs if you really watch them slowly.

So that stuff is probably never going to happen in the real world. If we're trying to enable manufacturers and brands and commercial partners to leverage AI, then it needs to be AI that's physically based.

And so, to give a very specific example of why it's important for us (and there are a whole lot of downstream benefits in terms of robotics and a lot of other things), what we're really focused on is brands. So say a brand has a digital manifestation of a physical product.

So a 3D model, yeah. And if they have the materials digitized via the TMAC, then what if they could take those, drag and drop them into [00:36:00] some sort of interface, like a Mode interface, and say: hey, put this chair, this beautiful XYZ chair, in a Lake Como palazzo, and put a lady sitting on it with her legs crossed. Within a matter of seconds, it's done. And then: hey, actually, change the shirt on that lady to the shirt I want to feature. That's done.

And then let's take it one step further. It's not only the brands themselves working with the tools; that's where the elimination of overhead comes in. Eighty percent of overhead is just eliminated overnight based on that capability alone. But take it one step further: what if we could embed that generative capability, generating a product in any world, into any website that currently has a video or an image in it? Just like YouTube has its embedded player, what if you could basically take Mode and have Embed Mode? That's what we're working on right now: using user preferences, user data, and user prompts. Basically, Dan, we know from your Google Analytics [00:37:00] data that seven of the past ten times you purchased something, there was something red in the image.

You don't know that, but we can figure it out through Google Analytics. So every time you go onto that site, we'll probably generate images for you with things that are red in them. So a brand... Red Thorlo socks! I've never purchased a red Thorlo sock, but I stand by the product.
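The "seven of ten purchases had something red" heuristic is easy to picture as code. A hypothetical sketch (the function name, tags, and threshold are invented for illustration and don't reflect ModeMaison's actual pipeline): a preference extractor counts attribute tags across past purchases and surfaces a tag only if it clears a frequency threshold, which could then be fed to the generator as a prompt hint.

```python
from collections import Counter

def dominant_attribute(purchases, threshold=0.5):
    """Return the most frequent attribute tag across past purchases,
    but only if it appears in at least `threshold` of them."""
    if not purchases:
        return None
    counts = Counter(tag for tags in purchases for tag in set(tags))
    tag, n = counts.most_common(1)[0]
    return tag if n / len(purchases) >= threshold else None

# Seven of the last ten purchases contained something red.
history = [["red"]] * 7 + [["blue"], ["green"], ["black"]]
print(dominant_attribute(history))  # "red": bias generation toward red items
```

The threshold keeps the hint from firing on one-off purchases; a real system would presumably weight by recency and work over many attributes at once, but the core signal is this kind of frequency count.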

I'm willing to try it. You made me think: it's a zero-fail mission. That's what we say internally; for retail, it's a zero-fail mission. We're not creating puppies playing in the snow; we have to sell a product at the end of the day. If it does not look exactly like the product, if it only kind of looks like it, that's an absolute non-starter. It has to be real. Unless, of course, you have to make one where there are puppies.

It kind of look like it, you have to make it where there's, there's puppies. I want to tell you that on my flight last night, I felt that there was a dog, the size of this animal definitely defied logic because when spatial intelligence, because when I walked to the bathroom, you [00:38:00] saw the dog spanning three rows and it's just not, why is the tail back here in 19, I saw the head in 17.

But I won't put my JetBlue experience on the roadmap. We'll fix that next. Speaking of the roadmap: we've talked a little bit about AI. Let's finally talk about Gen AI platforms, right? Because embedded in what I hear you describing is this future that's highly micro-personalized, that is going to be specific to Dan, generating images and content that I'm going to be most receptive to.

Today, I often get stuff in the mail, and I certainly get hit with targeted advertisements that have absolutely no appeal to me whatsoever. You're describing a future based on my past behavior, based [00:39:00] on what brands will begin to discern about me, based on what information, ideally, I am willing to disclose. In the universe we envision at Metaversal, I start to own more of my digital data and can decide selectively who to disclose that data to, and when, to get away from the targeted-advertising models that built Google and Facebook.

But let's talk about Gen AI platforms for commerce and retail specifically, because these have shown their merit with regard to ideation and inspiration, but it's unclear today whether they're commercially viable. What say you? So I guess there are two audiences we can speak to. On one hand, you have the creatives, the storytellers; on the other hand, you have the CEOs and C-suite business executives.

And then on the other hand, you have the, the kind of the CEOs and C suite business type of executives. We uniquely speak to both of them. So when it comes from a [00:40:00] creativity perspective what we tell, what I tell my friends in, you know, like my fashion friends it's, it's, Creativity is enabled here.

So what we're enabling you to do is create that world, that movie, to tie it back to the very beginning of what we talked about. You create these worlds, these movies; you create these parameters. Essentially: I want to create this beautiful, you know, Western Italian kind of world. Well, then you create it.

And what I mean by creating, specifically, is building these mini models on top of the foundation model, via images, via text, via video, explaining what that world looks like. You create these rules, and then, through Embed Mode, for instance, it generates the brand's products in any world imaginable.

It generates based on the user. And so, via Embed Mode, we're enabling users to be immersed fully in these movies on a hyper-[00:41:00]personalized level. Brands and creatives will set the parameters, set the dream, for the kinds of worlds they want to create. And then, as a consumer, I go onto a website.

Every time I go onto a website, it knows me better than I know myself, just because Google knows everything about everybody. And it's able to use that to incorporate you into these stories. So if I know, Dan, that you're pretty tall, you're six-two, a white male out of, you know, Brooklyn or wherever, and I know you have X number of kids or whatever.

Well, then it can start generating you and your kids, and really show more of you in these kinds of images, in these scenes. It just becomes a more hyper-personalized experience, so you really fall in love with these products. But when I talk to the business side of things, what this enables is heavy optimization: it eliminates a ton of waste, [00:42:00] waste meaning time spent, but also physical waste.

So it's really great across the board. I know there are going to be a lot of people who get caught up in the "oh my gosh, it's taking away designers, it's taking away production" reaction. Well, it's not, really, because you're actually enabling more creativity. That's the whole reason we went about this.

It was not just about the elimination of waste or sampling; that's a great by-product, honestly. It is really something for everybody. We're going to be talking here at Metaversal in the near future about how we leverage AI not to displace jobs, but to transition, transform, and upskill people, to leverage AI to unlock that creativity you're describing.

So I think you're precisely right. I want to wrap us up with a final question. John, you go first, and then, Steven, I want to hear your answer: if [00:43:00] we go in the way-back machine, what was the first thing you ever owned? I think about collectibles that I went to a store with my own money to buy, and that was probably baseball cards, you know, the Topps ones with the gum in them. You got the pack, looked through it, put each card in the folder, in the sleeve.

You felt like that was yours. You put it in a special place, referred to it, tried to build the collection: I need this one to get that one. Probably that, because it was a function of both having used my own money to get something and then caring for it, which I think is part of ownership.

A lot of lawns were mowed in order to get those, with all those references now. But yeah, unfortunately, the 1980s were a high-production era for Topps [00:44:00] and all the other card manufacturers. So rest assured that all of your cards are worthless, but we love the effort. Steven? I have a very different idea of ownership.

So there are three things. One was this leather jacket, actually a Ralph Lauren jacket. I still have it; I think it's one I saved up for and got. Another one was (we're talking about just owning things, right?) that I used to be a DJ in a past life. In middle school, actually, me and one of my friends who is a year older started a DJ company.

And by the time we were in middle school, there was literally a time when we DJed a wedding. I called him beforehand and said, I don't know, we're like 12 years old, this is kind of embarrassing. Anyway, we did it. And so we bought a lot of equipment.

That was fun. That was the first time I really owned something. And then the third one is so random, but y'all remember the little guy, Orbit, [00:45:00] the Astros mascot? I used to have this thing that I owned. It's a Houston reference. A Houston reference. So those are the only three things I remember from when I was little.

Oh, because of the Astros. Okay. How are the Astros doing? Because we know this year it's Yankees versus Dodgers; I haven't seen the Astros in a while. Unfortunately, I just made that reference, but I'm not a sports guy, so I don't actually know. Very interesting. You see, John, there actually could be some reason behind the rhyme there.

He was jaded by that acquisition, that early ownership experience. Like, to hell with these Astros; good point, he's not going to a baseball game again. Guys, I remain super excited about what you are all building at ModeMaison Labs. It's a vision for the future, and we know the future is happening much, much faster than most of us can comprehend.

But I'm very [00:46:00] optimistic, and I think we all are at Metaversal, that you guys are charting this unique course. I want to thank everyone for joining us on today's pod. To our listeners and watchers, thank you for tuning in, and know that we have some more amazing guests lined up in the near future.

And so until next time, thanks. Thanks, guys.
