The Dead Pixels Society podcast

The business case for automatic image correction with EyeQ

November 14, 2020 · Gary Pageau · Season 1, Episode 26

Gary Pageau of the Dead Pixels Society talks with Brad Malcolm, CEO, and Jeff Stephens, CTO, of EyeQ, the makers of the Perfectly Clear suite of intelligent image-correction products. Malcolm and Stephens discuss the new iAuto 2021 update, the challenges of correcting skin tones for a worldwide customer base, and the business case for automatic image correction.

EyeQ recently launched iAuto 2021, offering a variety of enhancements including sharpening, depth, skin tone preservation, and contrast. 

Energize your sales with Shareme.chat, the proven texting platform. 

ShareMe.Chat 
ShareMe.Chat platform uses chat-to-text on your website to keep your customers connected and buying!

Mediaclip
Mediaclip strives to continuously enhance the user experience while dramatically increasing revenue.

Buzzsprout - Let's get your podcast launched!
Start for FREE

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Support the Show.

Sign up for the Dead Pixels Society newsletter at http://bit.ly/DeadPixelsSignUp.

Contact us at gary@thedeadpixelssociety.com

Visit our LinkedIn group, Photo/Digital Imaging Network, and Facebook group, The Dead Pixels Society.

Leave a review on Apple and Podchaser.

Are you interested in being a guest? Click here for details.

Hosted and produced by Gary Pageau
Edited by Olivia Pageau
Announcer: Erin Manning


Gary Pageau  0:03  
Hello again, and welcome to the Dead Pixels Society podcast. I'm your host, Gary Pageau, and today we're joined by two people: Brad Malcolm, CEO of EyeQ, and Jeff Stephens, CTO of EyeQ. Good afternoon, gentlemen, and welcome to the show.

Brad Malcolm  0:21  
Thanks, Gary. Thanks for having us.

Gary Pageau  0:24  
Brad, you've been with the company since its founding in the early 2000s. Can you give us some history on EyeQ, where it started and how it got to be where it is now?

Brad Malcolm  0:37  
I certainly can. And I know you said to keep the podcast short, so I won't get into too much detail. But yeah, we started back in 2001 with the premise of, how does one automatically correct images, but with an accurate correction? So we're all about accuracy. Our inventor went to Europe to photograph stained glass windows, came back, and they were dark, and he tried to correct them in Photoshop. Not only was it tedious, but the colors were also shifted, and that was wrong. So that was the genesis for our company, which has certainly evolved. We've grown from, in essence, nobody to somebody, and that was all bootstrapped and done organically. I did a restructure a year and a half ago, and that's where I changed the name to EyeQ Imaging as part of that process. Hence the name change from Athentech to EyeQ. But the technology is Perfectly Clear, which has been around since the beginning, and that's where we leverage the automatic image correction that we have unique patents on.

Gary Pageau  1:35  
So Perfectly Clear is the name that most people know your company by. Can you talk a little bit about the various platforms that Perfectly Clear is in? Because I don't think people realize that it's basically everywhere.

Brad Malcolm  1:51  
Yeah, well, that's a good point. So just as an overview, Perfectly Clear is intelligent image correction, and we license that technology to companies around the world. And yes, it can be incorporated literally anywhere. We are used by some of the largest cell phone companies, where we're embedded directly in their camera pipeline, so we run natively on mobile. And we have our engines, which are the core of what we do. Think of it as taking our engine and licensing it into your car. So we can go into servers, we've got a server SDK; we can go into desktop software; we can go into your mobile apps, we've got native mobile SDKs as well. The engine can go anywhere. Last year, we launched a web SDK that runs directly in one's web browser, so it's easy to do a front-end implementation for one's customers to play with it. And then we also have software for those people who want to double-click and install it and just run the software. And the last area would be the web side of things, where we have a web API, so you can sign up for an account, send images to our AWS servers, we process them and send them back to you. So we literally are everywhere, which is one of the reasons we've got so many people using us: there's a solution for wherever you need it, and it's easy and effortless to implement.
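
[Editor's note: for readers curious what the web API route Brad describes could look like in practice, here is a minimal sketch of uploading an image to a hosted correction endpoint and saving the result. The URL, header, and field names are hypothetical placeholders, not EyeQ's actual API; their documentation has the real interface.]

```python
# Hypothetical sketch of calling a hosted image-correction web API.
# The endpoint, auth header, and field names are placeholders,
# NOT EyeQ's actual API.
import requests

API_URL = "https://api.example.com/v1/correct"   # placeholder endpoint
API_KEY = "your-api-key"                         # placeholder credential

def correct_image(path_in: str, path_out: str) -> None:
    """Upload an image and save the automatically corrected version."""
    with open(path_in, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            data={"preset": "default"},          # hypothetical tuning knob
            timeout=60,
        )
    response.raise_for_status()
    with open(path_out, "wb") as out:
        out.write(response.content)              # corrected image bytes

if __name__ == "__main__":
    correct_image("vacation.jpg", "vacation_corrected.jpg")
```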

Gary Pageau  3:14  
Now, Jeff, is it one platform, one core set of technologies that's implemented across various services and devices? Or is it tuned for each individual situation?

Jeff Stephens  3:28  
Yeah, it's one set of source code. It's one set of technology and algorithms that is compiled and bundled into all of those various different things. So regardless of whether you're using our mobile platform, the web-based SDK, or the desktop or server SDK, the image quality coming out is all identical. Sometimes there are tiny differences in, like, JPEG compression, but in the vast majority of our platforms it's pixel-for-pixel identical across all of our different offerings.

Gary Pageau  4:01  
And users, like in a lab environment, can tune the results to what they are expecting to produce.

Jeff Stephens  4:11  
Yeah, that's exactly right. We ship with presets, and we build presets, you know, with our customers over the decades that we've been doing this. So we have tuned very good default parameters, and some of our customers, both labs and, you know, our licensing and embedded customers, use those defaults. But yeah, a lot tune them for their specific output devices or types of photography, you know, the indoor wedding shots versus the outdoor wedding shots, whatever it might be. The more you tune and the more you customize, you can often get a better correction. But we also have great-looking results for just about any photo that comes through by just using our defaults.
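
[Editor's note: the presets Jeff describes are essentially named bundles of correction parameters with per-use-case overrides. The sketch below illustrates that idea only; the parameter names and values are assumptions for illustration, not Perfectly Clear's actual settings.]

```python
# Illustrative sketch of preset handling: named bundles of correction
# parameters with per-use-case overrides. Parameter names and values
# are assumptions for illustration, not Perfectly Clear's actual API.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class CorrectionPreset:
    exposure: float = 0.0       # stops of exposure compensation
    vibrancy: float = 0.3       # 0..1 saturation boost
    sharpen: float = 0.5        # 0..1 sharpening strength
    skin_tone_fix: bool = True  # pull overly red skin back toward neutral

DEFAULT = CorrectionPreset()

# A lab might tune a handful of overrides per type of photography.
PRESETS = {
    "default": DEFAULT,
    "indoor_wedding": replace(DEFAULT, exposure=+0.4, sharpen=0.6),
    "outdoor_wedding": replace(DEFAULT, vibrancy=0.4),
    "consumer_print": replace(DEFAULT, exposure=+0.2, vibrancy=0.45),
}

def choose_preset(name: str) -> CorrectionPreset:
    """Fall back to the tuned defaults when no custom preset exists."""
    return PRESETS.get(name, DEFAULT)
```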

Brad Malcolm  4:53  
What's important to remember is we have a wide breadth of people using us. Our iAuto 2021, which we just launched, and our prior iAutos, they work great out of the box. But remember, we've got studios and professional photography, and that is slightly different than consumer use cases, where generally the consumer wants a bit poppier and a bit brighter image, while the pro images are coming in at a different level. And then we also have customers in basically every country around the world using us. People in Asia have a different preference versus Europe versus North America, hence the need, even though we're automatic, for tweaking those automatic parameters to achieve an individual end-user result.

Gary Pageau  5:39  
So let's talk a little bit about iAuto 2021, which has just been launched; you just mentioned it. It's the new generation of the platform. What is the big enhancement here?

Brad Malcolm  6:00  
Jeff, do you want to talk about that, or do you want me to talk about that?

Jeff Stephens  6:03  
Go for it. Take it away.

Brad Malcolm  6:07  
It's iAuto 2021 because everyone's looking forward to 2021, and we're launching at the end of the year, so hence the name. But yeah, we're continually innovating and improving. That's what we do: take our patented technology and ask, how do we make it better, and what are the needs of our customers? So what we specifically included in this one, based upon feedback, is, one, a new approach to sharpening, so images are much crisper. We spent a lot of time optimizing this for prints as well, because print and on-screen are different. So you're going to get crisper, sharper images. Secondly, we've added in what we call our super contrast, which is really important, again, for print and on screen: more depth in the photo, and images really pop. They're sharper, but there's a lot more pop to them as well, hence why we call it super contrast. Whether it be faces or landscapes, there's more depth between background and foreground. A picture's worth a thousand words, so you really have to see it to see what it does. The third thing is what we call skin tone optimization. There are multiple facets to this, but one is that skin tones often come out, printed or on screen, too red. There are multiple reasons for that. There are good printers out there, but as they get wider and wider color gamuts, the downside is the skin tones they print are too red. Also, cameras capture infrared light but the human eye doesn't see it, and that's another reason why skin tones are often, you know, too red or too orange. So having the ability to tweak that, but also automatically building that in, because we've got North Americans, we've got African Americans, we've got people from all the different places. The ability to tune that skin tone aspect is also a key one. And then Jeff and his team have built in AI clipart detection: if it's not a real photo, detect it and skip it.
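
[Editor's note: EyeQ's clipart detection is AI-based. As a much simpler illustration of the "detect it and skip it" idea, here is a crude heuristic that flags images with very few distinct colors, which is typical of flat graphics and stickers, so a correction step can be bypassed. This is an editor's sketch, not EyeQ's algorithm.]

```python
# Crude illustration of "skip non-photos before correcting".
# A few-unique-colors heuristic stands in for EyeQ's AI clipart
# detector; this is an editor's sketch, not their algorithm.
import numpy as np
from PIL import Image

def looks_like_clipart(path: str, max_colors: int = 512) -> bool:
    """Flat graphics and stickers tend to use far fewer distinct
    colors than real photographs do."""
    img = Image.open(path).convert("RGB").resize((128, 128))
    pixels = np.asarray(img).reshape(-1, 3)
    unique_colors = len(np.unique(pixels, axis=0))
    return unique_colors < max_colors

def process(path: str) -> None:
    if looks_like_clipart(path):
        print(f"{path}: looks like clipart, skipping correction")
        return
    print(f"{path}: real photo, applying automatic correction")
    # ... hand off to the correction engine here ...

if __name__ == "__main__":
    process("sticker.png")
    process("family_photo.jpg")
```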

Gary Pageau  7:59  
I thought that was actually kind of cool, because you get the integration of stickers and clipart with some of the images that people are capturing these days. That is probably a big challenge when you've got, you know, Pikachu in the corner and you've got the main subject. That's pretty impressive.

Brad Malcolm  8:20  
Yeah, and what people are asking for in today's world is: just automatically save me time and give me the best images possible. There are a lot of different things thrown into that. So we need to be able to do an automatic correction, but only do it when necessary, and know what type of image needs it and what doesn't. That's all part of becoming the smartest, and continuing to be the smartest, that AI can be.

Gary Pageau  8:49  
So does North America present a greater challenge in terms of skin tones because of the diverse population? And what kind of regional differences do you see around the world? Jeff, we had talked earlier about how you've actually got an Asian version of the iAuto product.

Jeff Stephens  9:11  
Yeah, so there are two aspects to that. Of course, there's the different range of skin tones that people have, so in different geographies there are different populations with, you know, different depths of skin color; the people are actually different. But then preferences differ as well. So what we try to do is have the best-looking correction by default, but then also allow our customers in different regions to quickly swap out, you know, presets or algorithm tuning for regional preferences. The people might be the same in different geographies, but preferences can vary from place to place. So you have to be flexible for both things: both the people in the photos and the people looking at the photos.

Gary Pageau  10:01  
So is this something that your customers around the world can control?

Jeff Stephens  10:07  
Yeah, exactly. And we've spent a lot of time tuning that with our big licensing customers and our, you know, lab and printer customers as well, to make sure they get the best result for the viewers who will be seeing the end product.



Gary Pageau  0:02  
So, EyeQ has really been known for being a pioneer in the image correction business, and I think everyone can agree, you know, great-looking pictures are something that everyone wants. But what's the business case for improving your pictures? Is it just to improve customer satisfaction? Are there tangible rewards to the bottom line for having better pictures?

Brad Malcolm  0:27  
Oh, well, Gary, there are lots of tangible rewards. And it's not just about better pictures, although that's certainly a key component of it. It depends on each customer, which is different. But to answer the question at a more granular level: one, we've got studies showing a better image is going to help encourage more prints. One of our customers, and you can go to our website and read about it, is Norwegian Cruise Line, where they increased their revenue. When images look better, customers are more likely to print, and they're going to choose more, whether that's choosing to order at all or making a larger order. That's a way to monetize, also on canvases or other items where we see a large dollar value per image. We've seen, and I've had this discussion with executives, that if it doesn't look ideal on screen, it's less likely to be ordered. I've also been at Costco kiosks, and this is a real story: the couple next to me was ordering, and the wife wanted 50 pictures of her child, and the husband was saying, I don't think we should, because their face is dark, we can't see it, why should we print this? And if they don't know that it can look better, they're not going to print it, so people are losing out. So helping people print more, to increase revenue, is certainly a key and important tangible benefit that we've been able to accomplish. Another thing, and we're seeing this a lot more in pro labs over time with COVID, but even in non-pro labs, is just saving time. People expect the best; they expect labs and printers to just do a better job. But doing that manually is expensive, and it's also not the best value-add. So doing it automatically is a time saving, and saving time saves money. That's why there's a big aspect to that, especially with automation in theme parks, and as people get worried about COVID, reducing touch points. Automating more and more of that process is coming to the forefront. And there's also the benefit of increasing customer satisfaction. Obviously, reprints mean somebody prints the order, and if you've got to reprint it, there's a cost, but there's also the intangible cost of "my photo came out looking bad." It's one thing if a photo looks dark or is blurry or not sharp or not as vibrant, but it's another thing if it's a whole photo book; that cost to reprint is more expensive, and you can just lose the customer.

Gary Pageau  2:56  
One of the things that we used to find back in the days of, you know, doing marketing research for the industry is that consumers are pretty hard on themselves when they consider their own photographic expertise. In most cases, if a picture was substandard, they often blamed themselves for doing something wrong, right? So it was always a disincentive to be involved with it, especially back in the film days, because back then there wasn't the amount of correction that's happening automatically right now. So I think there is a valuable lesson there: if you've increased customer satisfaction, you're going to get more of that behavior, right?

Brad Malcolm  3:46  
Absolutely. And also putting us on the front end, while the customer is there, is important, because now you're wowing the customer with options, whether it be creative filters, whether it be perfect selfie, whether it be perfect landscape, for sure.

Gary Pageau  4:01  
So we've touched on something about better pictures, but there's sort of an interesting dynamic there between realistic pictures, things that accurately capture the scene, and what a consumer may think is a better picture, right? Because you've got all these filters and enhancements and whatnot that are kind of what's in vogue now, right? There's HDR and all these other filters and enhancements you can apply to a photo. So Jeff, how do you balance that, you know, realistic versus better?

Jeff Stephens  4:40  
Yeah, it's a good question. And obviously, whatever the customer likes most is the right answer. But what we try to do is provide corrections that look as realistic as possible, and that in itself is a challenge. Obviously, what you see with your eye when you're out walking around, on vacation, you know, standing on the beach, that is reality; your eye is perceiving the scene as it is. But if you snap it with your, you know, cell phone, there's processing going on automatically, so the first image you see on it is already highly processed. People are getting tuned to think that "real" is the first digital version they saw, which might have HDR, lots of contrast, lots of saturation, instead of the first mental image they saw. So there is a trend, sort of industry-wide, I believe, of higher contrast, more saturation, a little more pop, even in images that aren't, you know, advertised as being retouched, just straight out of the box: more vivid, more pop, more impact. And then we also have, you know, the tools to do the creative effects on top of that. It's always one of those things where our customers have to strike that balance, or make it an option for their customers, so they can opt in for the, you know, HDR effect, even more boosted saturation or contrast, those kinds of things.

Brad Malcolm  6:19  
One thing I'd like to add, because people do ask that, and there's a lot of confusion on it: our patents dictate this, and it's been at our heart from the beginning, and that's the reason why we have a trademark called Real Color Photography. We do image correction, and that's why we've always said correction versus enhancement. There are camera limitations. That's why an image is noisier, darker, not as sharp as you remember; the camera, despite them getting better and better, with more lenses, still can't replicate the human eye. So we've always been about a realistic image. Everyone else does enhancements, but we do real color photography; we're the only solution that can deliver that. We've got lots of science as to why, but it replicates the way the human eye works, and we were the first one to do that. And we break that into different things. Obviously, the color of a leaf as you see it with your eyes should be corrected to the same correct color, even if the image is dark. Now, we do have beautify technology. Beautify, though, has been given a bad name by companies whose beautify artificially smooths an image out and adds makeup. And that's fine, that can serve a purpose; there's some interesting AR stuff being done for makeup companies. But with our beautify we've got two modes. We have our accurate beautify, which smooths or eliminates dark circles under your eyes and makes you look brighter, so you look more rested; everyone wants to look like that. Sure, we can slim a face and we can whiten your teeth, and we can do those things as well, but that's at the discretion of the user, because people do want to look their best. So we have a realistic correction, and we give you the ability to look your best and to have your creativity. But everything we do at our core is about an accurate correction; always has been and always will be. And that's, again, something that's unique.

Gary Pageau  8:03  
Because I think that is one of the things, you know: if you've spent some time in the social media world, there is a crazy amount of correction that goes on, where they take any reasonably attractive person and turn them into something that looks like a mask, in some ways, where it's overdone. And I'm not sure who that's serving, but it certainly is interesting.

Brad Malcolm  8:31  
Well, people like to be creative as well, sure. And that's why there's a third bucket we added in a couple of years ago; we call them creative looks. That's when Photoshop introduced Looks, and you see lots of other companies now adding them, naming them different things, but it's the creative looks side. We did that because, again, it adds to a full package. Customers do like to play; they do like to have a retro look, some, you know, different funky colors, and film stocks. So yeah, we've got stuff that replicates old film stocks and all that old era, for those customers that want to license that and build it in; we've been told, yeah, they're better than Instagram filters. It's just part of the complete package, and people do like being creative; millennials love playing with that. But that's different from the correction side, which is just: I don't want to look dark, I want to see my daughter's face. I don't want it to have a red cast, I want the sky to look bright. I want to remember that memory as it was when I was outside.

Gary Pageau  9:26  
You know, it's interesting. This has been a dynamic that people have been talking about for years. As we talked about before we started, back in the day Kodak was known for, you know, their big thing was realistic color, color as you saw it, right? That was sort of their hallmark. And then when Fuji came into North America, they came in with their poppy colors; they were, you know, very saturated greens and reds, and they really took a big share of the market because people liked the saturated colors. Even though customers will tell you, "No, I want accurate color, I want it to be just as it was," when you show them side by side, they almost always pick the saturated color. So it's one of those things where customers will tell you one thing and do another, that's for sure.

Brad Malcolm  10:14  
Well, there's also something interesting in the market, and Jeff alluded to that as well. More and more pictures, as we know, are being taken on cell phones, and cell phones have a tendency to oversaturate the image, make it extra bright, make it extra poppy. So people get used to that look; they're used to their images looking that way. Now, when they send those for print on a CMYK model, they don't come back as bright, or with the same look they're used to seeing digitally on their phones. And that's why we've been working with many of our customers on narrowing that gap: here's how you can set those expectations and make your images brighter and more colorful to meet that expectation.

And one other avenue down that path is who took the photo and who's doing the correcting. We have customers who are photographers, so they are the creative director; they're making all of the aesthetic choices themselves. But most of our customers are businesses that are correcting the images on behalf of the photographer. So for that use case, you stay realistic, or it takes away some of the creative license. But the goal of the best-looking photo is the thing that crosses both of those use cases. So it's automatic correction, but, you know, who's making the aesthetic judgment versus the creative one, the ability to decide whether to apply one of those filters or not?

Gary Pageau  11:53  
So what is the path that EyeQ is on? Where do you take this technology next? Because, as I think most people would agree, it's pretty darn good right now. I mean, we'll see what the reception to the iAuto 2021 technology is, but I think you've already had some people adopting it right away. So what is sort of the next generation or the next implementation we're going to see of the accurate color correction technology?

Brad Malcolm  12:31  
Well, the high level is basically that people expect: hey, I'm going to toss everything, including the kitchen sink, at you, and you toss it back to me without me having to do anything. So pro photo, consumer photo, indoor, outdoor, wedding, you know, everything; just automatically remove all the manual intervention and do the best that you can, always better, never worse, really aggressive sometimes but not aggressive other times. So just continuing to get smarter. That's the high level. Jeff, I don't know if you want to talk a bit about some of the more specifics on what you and the team are working on when it comes to AI and non-AI stuff.

Jeff Stephens  13:08  
Yeah, one of the things we're looking at now is this: we've been doing auto correction, where we look at an image and the only information we have is the pixels in that image. We take that image and make it as good as we can based only on that information. And now we're trying to get a little smarter and say, okay, where is this going to end up? Is it on an Indigo press? Is it an inkjet? Is it on a screen? Is it a two-inch-tall print in a yearbook? Is it a 36-inch-tall wall print? All of those influence what the, quote unquote, best output image is. You consume a two-inch image in a yearbook differently than you do wall art, so you probably have different desires about those image characteristics. So this is going to be...

Gary Pageau  14:07  
...kind of a huge challenge, because I remember back in the day when, you know, X-Rite and folks like that were big on color-managing monitors, right? So now you're kind of doing that. Not color management, per se, but you're correcting color without really any reference to the monitor or the screen quality that's actually being used.

Jeff Stephens  14:34  
Exactly right. Soft proofing was the, quote, easy part of the problem, and it's a very hard problem. The thing that we're trying to tackle is not just the output gamut of the device it's going to be printed on, but those, you know, psycho-perceptual aspects. This is wall art, it should feel like something; or it's a yearbook photo, and the most important characteristic of a yearbook photo is that it looks similar in density and tone and color to the photos around it.
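
[Editor's note: the basic soft-proofing step Jeff calls the quote-unquote easy part can be done with off-the-shelf color management, for example Pillow's ImageCms bindings to LittleCMS. The sketch below previews on screen how an sRGB image would render through a CMYK printer profile; the ICC profile path is an assumed placeholder for your printer's actual profile.]

```python
# Basic soft proofing with Pillow's ImageCms (LittleCMS bindings):
# preview on an sRGB screen how an image will render through a CMYK
# printer profile. The .icc path is an assumed placeholder.
from PIL import Image, ImageCms

srgb = ImageCms.createProfile("sRGB")
printer = ImageCms.getOpenProfile("profiles/my_printer_cmyk.icc")  # placeholder path

proof = ImageCms.buildProofTransform(
    inputProfile=srgb,
    outputProfile=srgb,            # we still display on an sRGB screen...
    proofProfile=printer,          # ...but simulate the printer's gamut
    inMode="RGB",
    outMode="RGB",
    renderingIntent=ImageCms.Intent.PERCEPTUAL,
    proofRenderingIntent=ImageCms.Intent.RELATIVE_COLORIMETRIC,
)

img = Image.open("wedding_shot.jpg").convert("RGB")
preview = ImageCms.applyTransform(img, proof)
preview.save("wedding_shot_softproof.jpg")
```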

Gary Pageau  15:08  
Right?

Jeff Stephens  15:09  
So you're not optimizing the individual photo, you're optimizing the page of photos. We're expanding the scope of what a good photo looks like: it's not just the best-looking photo, but the best-looking photo consumed in whatever medium it's going to be consumed in.
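
[Editor's note: a minimal sketch of the "optimize the page, not the photo" idea in the yearbook case: nudge each headshot's brightness toward the average of the set so neighboring portraits print at a similar density. This is an editor's simplification, not EyeQ's density or background balancing.]

```python
# Editor's simplification of page-level consistency: shift each
# yearbook headshot's brightness partway toward the group average so
# the page prints at a similar density. Not EyeQ's actual algorithm.
import numpy as np
from PIL import Image, ImageEnhance

def mean_luma(img: Image.Image) -> float:
    return float(np.asarray(img.convert("L")).mean())

def balance_page(paths: list[str], strength: float = 0.5) -> list[Image.Image]:
    images = [Image.open(p).convert("RGB") for p in paths]
    target = float(np.mean([mean_luma(im) for im in images]))
    balanced = []
    for im in images:
        current = mean_luma(im)
        # Move part of the way toward the page-wide target brightness.
        factor = 1.0 + strength * (target - current) / max(current, 1.0)
        balanced.append(ImageEnhance.Brightness(im).enhance(factor))
    return balanced
```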

Gary Pageau  15:25  
It's almost like a contextual sort of consideration, right? The context in which it's going to be seen.

Jeff Stephens  15:33  
Yeah, and, you know, how close to the image you're going to be, which tells you how sharp it needs to be, or, you know, the blur radius. If you're looking at it in a photo book, it's different than looking at wall art: you're farther away, and it's much bigger. So not just things like DPI and output sizing matter, but also how you process the image and some of those image corrections. So it's gaining knowledge about the output side: what hardware, what's the intent? On screen versus in print is a huge problem, just like you alluded to. Not every monitor is calibrated; I would say most aren't. And then with mobile devices, you know, now they have the blue-shift modes at night where the screen goes kind of amber, so a person browsing through a photo catalog then is going to have a very different experience than doing it, you know, outside in natural light. So how do you preview an image accurately, taking into consideration things like noise and sharpness, knowing how it's going to be output? Yeah, it's a big problem. And those are some of the things that we worked on for this release, and also going forward.
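
[Editor's note: one concrete form of the output-aware processing Jeff describes is scaling the sharpening radius with how large each pixel appears to the viewer, which depends on print size, resolution, and viewing distance. The formula and constants below are illustrative assumptions, not EyeQ's tuning.]

```python
# Illustrative sketch: choose an unsharp-mask radius from the output
# context (print size and viewing distance). Constants are
# assumptions for illustration, not EyeQ's tuning.
import math
from PIL import Image, ImageFilter

def sharpen_for_output(img: Image.Image,
                       print_width_in: float,
                       viewing_distance_in: float) -> Image.Image:
    dpi = img.width / print_width_in          # pixels per printed inch
    # Pixels per degree of visual angle at the given viewing distance:
    ppd = dpi * viewing_distance_in * math.tan(math.radians(1.0))
    # Assume a sharpening radius of roughly 1/60 of a degree (near the
    # eye's resolving limit), clamped to a sane range.
    radius = min(max(ppd / 60.0, 0.5), 5.0)
    return img.filter(ImageFilter.UnsharpMask(radius=radius,
                                              percent=120, threshold=3))

# A 2-inch yearbook photo read at arm's length ends up with a much
# smaller radius than a 36-inch wall print viewed from across the room.
```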

Brad Malcolm  16:52  
And Gary, I guess from a business side, a couple of things. One, we work closely with our customers and listen to them, and that's how we develop the new things. iAuto 2021 we built based upon what our customers wanted to see. Although we've never raised $2 billion like a Quibi, we've been around a long time, and we're going to continue to be around a long time, and we're in touch with that aspect. People rely on us because we do image correction; absolutely, we're the leaders in that field. But anything image-related, we're here to support our customers' needs. So better images, people rely on that for cost savings, for consistent output and, I hope, increased revenue, but also for other imaging needs. And you've seen several things being launched on that side, on just imaging services: you've got low-resolution images and need them high-res, we support that; or our background balancing, which is unique to school and yearbook, making backgrounds consistent, which is also a cost saving but a need; and then face cropping. So supporting that whole imaging ecosystem is certainly one area as we go down the path to continue to be the leaders in that field and support the needs of our customers and future customers. And obviously, in that, photos versus video is an interesting topic of conversation: should we support video? Do you have a need to support video? We'd love to hear from your listeners what their thoughts are, whether they have a need for that; we're talking to people. And we've been told on the mobile side that as we see more rollout of 5G and faster speeds, people will most likely be shooting more video. What does that mean? Video's an interesting topic.

Gary Pageau  18:48  
Yeah, and if you don't have some sort of monetization on the back end for that for the customer, how does a customer get the ROI on that investment, right? I mean, I could certainly see it if it's an advertising-driven platform, like a TikTok or something like that. But for, you know, somebody shooting video on their phone of their kid's birthday party, there are no prints involved, right? So what is the monetization aspect of that? So maybe, like you said, we'll have some listeners chime in, send us an email or something to give us some input on that. That would be a great source of information, I think.

Brad Malcolm  19:36  
Yeah, we'd love to hear that, because that's helpful as we formulate our path forward. And there's just knowing a lot more about the image info as well; we get people who want to know that. Obviously, we do image analysis in order to do our automatic correction, but there's a lot more down the road. As we apply the AI models that Jeff and his team are building, we'll be able to reveal to our customers more and more info about whether it's a landscape, a portrait, a wedding, a baby, etc. One can enable that for, like, auto gift creation or other needs.

Gary Pageau  20:18  
Yeah, because as you look at things like AI sort of encroaching into many areas of technology in terms of servicing the end user, it's very interesting to me when you consider things like image enhancement, where not only are you looking at the pixels and the color and whatnot, but you're also looking perhaps at the context in which the pictures are taken: oh, this is a wedding, and this is a wedding dress. So maybe we'll do something a little different than we would if we just looked at something that was large and white in a picture.

Jeff Stephens  20:53  
Yeah, that's exactly right. That's part of gaining more knowledge than just the pixels themselves: what was the intent of the photo? What's in it? What do you think the intent was? The wedding dress is a great example. And then also, ten pictures of the same wedding dress all need to look similar, because they're all going to be printed in that album or, you know, in the same collage on a wall. So it's just expanding the realm of data that we have available while we're processing the image. Is it a night scene with a wedding dress in it? And how does that affect the intent of the photographer and how the photo is going to be used? Ultimately, we invent tools that help make our customers money. So until we know the business case behind something like video, for example, where there's not an obvious monetization path for a lot of our customers that are printers, we're staying in that world of: how do we make the photos better by learning more about them, and what can we use that knowledge for?

Brad Malcolm  22:07  
I was just going to say, we know there's a lot of discussion and good stuff happening, both by third-party companies and by companies internally, for auto photo book curation and other things. You want to do that as smartly and as efficiently as possible. You mentioned white earlier: obviously, white snow, which we've got a lot of in Canada right now, and white sand, which there's a lot of in the Caribbean, those go on different pages, probably not next to each other, versus white in a wedding dress. So it's just being smarter about what's in an image. Not that I'm announcing anything new on that, but it's important to remember.

Gary Pageau  22:41  
Yeah, because I think one of the things that's going to be driving a lot of technology going forward is more information related to the context of whatever it is you're dealing with. Like in Spotify, right? They know what songs you listen to, so they can suggest similar songs from other people; they have more context. When you're shopping on Amazon, they know what you shopped for, they know what you click on, so they're going to suggest other things for you, because they have more information. So I think photography is going to go that way, where you're going to have auto curation of content, auto correction of content, based on what's in the picture, as opposed to just looking at, like you said, the bare pixels and, you know, the color gamut and all that cool stuff.

Jeff Stephens  23:30  
Yeah, and auto-determining, you know, aesthetic judgment: one customer printed these kinds of images versus those kinds of images, okay, they like more poppy, high-dynamic-range, more saturated, or this customer's catalog looks more rustic, or whatever it is. That's exactly right. Learn not just from the single image, but from a whole corpus of images: which ones are successful, which ones are printed, and how does that differ from the population at large? There are a lot of neat things that we're looking at there.
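
[Editor's note: the "learn from a whole corpus" idea can start very simply, for instance by summarizing the saturation and contrast of the images a customer actually printed and comparing that profile against the wider catalog. The statistics below are an editor's sketch of that kind of analysis, not anything EyeQ has described.]

```python
# Editor's sketch of corpus-level style analysis: summarize the
# saturation and contrast of a set of images (e.g. the ones a
# customer printed) so they can be compared against a wider catalog.
import numpy as np
from PIL import Image

def image_stats(path: str) -> tuple[float, float]:
    """Return (mean saturation, RMS luminance contrast) for one image."""
    hsv = np.asarray(Image.open(path).convert("HSV"), dtype=np.float32) / 255.0
    saturation = float(hsv[..., 1].mean())
    luma = np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0
    contrast = float(luma.std())
    return saturation, contrast

def corpus_profile(paths: list[str]) -> dict[str, float]:
    stats = np.array([image_stats(p) for p in paths])
    return {"mean_saturation": float(stats[:, 0].mean()),
            "mean_contrast": float(stats[:, 1].mean())}

# Comparing corpus_profile(printed_images) with corpus_profile(all_images)
# hints at whether a customer favors poppier, higher-contrast photos.
```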

Gary Pageau  24:05  
That's pretty cool, actually. I hadn't really thought about that, but you could almost assess a photographer's style. That's pretty cool.

Jeff Stephens  24:18  
Yeah, there's a whole notion in the AI imaging world called style transfer. That's not something that we're, you know, eagerly working on right now, but you can say, here's an Ansel Adams style, and then go apply that to a modern photograph, or something along those lines. But yeah, there are a lot of interesting things there about understanding more about the context, like you said: not just the individual photo, but the individual photo in a collection, in, you know, a whole body of work over decades. And what do you do with that added knowledge?

Gary Pageau  24:57  
Well, I guess what I'm thinking of is, you could even go to the standpoint of, say you're an established wedding photographer, you could actually look at what your more successful images are, right? Let's say you shoot 100 weddings a year. After a couple of years you're going to have, through an AI bot or a dashboard or something, an actual representation of which style of photo sells better than others. Right?

Jeff Stephens  25:24  
Yeah. And that's a super tricky thing to actually determine, because the headshot of the bride is always going to sell. Can you actually A/B test, like, a slightly different style, and do that consistently? So it's a complicated thing to discern. Obviously, any kind of big data problem takes big data; it takes hundreds of millions of photos to get that insight. But yeah, as more and more images go online, there's more and more consumption. How do you choose what was the most popular? Is it what was purchased? How do you know how it was reused after that point? There's lots of interesting research that can happen down that path.

Gary Pageau  26:13  
Yeah. And even then, maybe "popular" means it was shared a lot, but they didn't make much money off of it.

Jeff Stephens  26:21  
Exactly. At the end of the day, what's profitable? Are you saving money? Are you making more money?

Gary Pageau  26:29  
That's interesting. So if I were a potential customer of iAuto 2021, where would I go for more information?

Brad Malcolm  26:41  
You can go to our website, EyeQ.photos. We have landing pages on there that talk about iAuto. Just reach out to sales directly through the website to contact us, or email sales@eyeq.photos; we definitely respond. You can even pick up the phone and call us.

Gary Pageau  27:02  
What? That's crazy. That's very 20th century, Brad. Come on.

Brad Malcolm  27:07  
Yeah.

Gary Pageau  27:09  
Well, guys, thank you very much for your time. I appreciate it, and I hope to touch base with you soon. Have a great week.

Brad Malcolm  27:19  
Well, likewise, Gary. It's always a pleasure being part of what you do. Keep up the good work on having good conversations for the whole digital, output, and film community. It's important what you do to keep it alive, so thanks for your efforts.

Gary Pageau  27:33  
Good to talk to you too, Jeff.

Jeff Stephens  27:35  
Cheers. Thanks, Gary.

