We're back for Q3 2024. Quick question of the day ... then on to the topics of the day.
Topic 1: MIT Releases the AI Risk Repository
Thanks to the Massachusetts Institute of Technology for a major resource. It's definitely more for you than your clients. But we see great opportunity here.
The public database contains 700+ risks as documented by published papers. It includes many categories, based on severity and other variables.
Check it out here: https://airisk.mit.edu/
You can poke around online and get some good information. Or download the entire database in Excel format. You can then sort, color code, or do whatever you want with the data.
And that includes providing education and training for your clients.
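If you export the downloaded database to CSV, a quick tally of risks by category takes only a few lines of Python. This is a minimal sketch: the file name and the "Category" column header are assumptions, so check the headers in the actual export before running it.

```python
import csv
from collections import Counter

def count_risks_by_category(path, category_column="Category"):
    """Tally rows in a CSV export of the risk repository by category.

    The column name is a guess; adjust it to match the actual export's headers.
    """
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        return Counter(
            row[category_column].strip()
            for row in reader
            if row.get(category_column, "").strip()  # skip rows with no category
        )
```

Calling `count_risks_by_category("export.csv").most_common()` then lists categories from most to least documented risks, which is a handy starting point for a client-facing summary.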
-- -- --
Topic 2: Shocker - Google Has a Monopoly on Search!
Okay, not really a shocker. But does it matter that a judge says so?
Does anyone care (other than other search engines)? Would breaking up Google change anything? What is most likely to actually happen (e.g., a Microsoft-like consent decree)? The 1984 AT&T breakup into seven Baby Bells may help us see 40 years into the future.
As usual, we discuss.
Related Link: https://www.nytimes.com/2024/08/13/technology/google-monopoly-antitrust-justice-department.html
-- -- --
Topic 3: NIST Releases Cryptography for Quantum Computing
See https://www.axios.com/2024/08/13/nist-post-quantum-cryptography-encryption.
Quantum computing has been ten years away . . . for more than forty years. Besides the “no harm to update,” how much attention do we really need to give to quantum computing?
We have three different perspectives on how much this "news" matters. As usual. What do you think?
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
00:00:00 --> 00:00:06 From somewhere deep in the cloud and the
00:00:06 --> 00:00:12 corners of the earth, this is the Killing It podcast, with a focus on helping you
00:00:12 --> 00:00:15 make sense and dollars of all things IT.
00:00:15 --> 00:00:21 With your hosts, Dave Sobel, Ryan Morris, and Karl Palachuk.
00:00:21 --> 00:00:29 Welcome, everybody, to the brand new episode 209 of the Killing It, Killing It
00:00:29 --> 00:00:31 Podcast. You guys are actually in sync.
00:00:31 --> 00:00:33 I'm not going to have to do anything to that.
00:00:33 --> 00:00:34 This is instinctual at this point.
00:00:34 --> 00:00:36 We do it so well.
00:00:36 --> 00:00:37 Smooth professionals.
00:00:37 --> 00:00:40 Probably did it in your sleep last night.
00:00:40 --> 00:00:43 Guys, I'm going to just dive right in because I'm excited to talk
00:00:43 --> 00:00:46 to you about our topics. I want to warm us up a little bit.
00:00:46 --> 00:00:47 We're coming off the Olympics.
00:00:47 --> 00:00:53 If you could actually compete at an Olympic level, what athletic
00:00:53 --> 00:00:54 event would it be?
00:00:55 --> 00:00:56 I am super non-athletic.
00:00:57 --> 00:01:01 I've never actually been able to run a block.
00:01:02 --> 00:01:06 But I think if I were in the Olympics, I would want to do archery
00:01:06 --> 00:01:11 just because it would be fun to actually be good at something like that.
00:01:11 --> 00:01:16 I can shoot an arrow, I can't hit anything.
00:01:16 --> 00:01:20 They always say, If you miss the target, it's not the target's fault.
00:01:20 --> 00:01:24 That defines my business.
00:01:24 --> 00:01:31 See, and if one of the options is not becoming a professional Olympic
00:01:31 --> 00:01:36 watcher, which, by the way, I did in the last couple of weeks, it
00:01:36 --> 00:01:38 was very compelling stuff.
00:01:38 --> 00:01:42 And by the way, super kudos to Paris for the way that they did the locations
00:01:43 --> 00:01:44 and the venues and everything.
00:01:44 --> 00:01:48 Seriously, that was cooler than any other Olympics.
00:01:48 --> 00:01:53 But I would still go back to where I am from, which is the soccer stuff.
00:01:53 --> 00:01:58 I was actually, as a teen, coming up in a development program, Olympic track
00:01:58 --> 00:02:01 for the U23 Men's Soccer Tournament.
00:02:01 --> 00:02:03 Now, I missed it because I wasn't good enough.
00:02:03 --> 00:02:09 But if I could overcome that in my youth, one can only imagine what
00:02:09 --> 00:02:11 the trajectory would have been.
00:02:11 --> 00:02:14 So the Men's Soccer team, they need some help.
00:02:14 --> 00:02:17 Let's just admit it. They need some help right now.
00:02:17 --> 00:02:19 They need to take some lessons from the women's soccer.
00:02:19 --> 00:02:21 You might actually be the closest to being able to compete at that level
00:02:21 --> 00:02:24 of any of the other sports. I'm with you, Ryan.
00:02:24 --> 00:02:28 Gold Zone was the greatest invention ever for Peacock.
00:02:28 --> 00:02:32 I loved the ability to watch all the sports all day long.
00:02:32 --> 00:02:37 As they moved quickly around, and here's this one, and here's this one, and here's
00:02:37 --> 00:02:38 gold over here, and here's gold over here.
00:02:38 --> 00:02:41 I was addicted, and it was too fun.
00:02:41 --> 00:02:45 But if you were actually- Oh, by the way, it's in front of the Eiffel Tower.
00:02:45 --> 00:02:46 It was pretty cool.
00:02:46 --> 00:02:50 When I'm picking a sport, I'm going to go with distance cycling because
00:02:50 --> 00:02:52 that feels the most useful.
00:02:52 --> 00:02:58 Because not only could I actually do it at that level, I could then also commute
00:02:58 --> 00:02:59 locally in really useful ways.
00:02:59 --> 00:03:02 It feels like it would be a really cool sport.
00:03:02 --> 00:03:02 Right.
00:03:02 --> 00:03:05 You can get back to pedaling your electric bike.
00:03:05 --> 00:03:07 I know. Exactly.
00:03:07 --> 00:03:10 I want to take time off the e-bike.
00:03:10 --> 00:03:15 And I'm pretty sure Carl's skill is also tactically applicable.
00:03:15 --> 00:03:20 So in a world, I'm thinking being able to play soccer, that's cute.
00:03:20 --> 00:03:23 But archery would get you some kudos.
00:03:23 --> 00:03:26 On occasion, you really need that skill.
00:03:26 --> 00:03:28 All righty.
00:03:28 --> 00:03:31 Well, let's dive in. We have three topics, as usual.
00:03:31 --> 00:03:35 So the first topic we're going to dig into today is MIT.
00:03:35 --> 00:03:37 We're going to put a link in the show notes.
00:03:37 --> 00:03:43 But MIT has released an AI Risk Repository, which is
00:03:43 --> 00:03:50 literally 700 articles and published papers on risks that you need to be
00:03:50 --> 00:03:53 aware of with artificial intelligence.
00:03:53 --> 00:03:59 The cool thing about this is if you go to the link that we give you, it's hosted
00:03:59 --> 00:04:06 at MIT, and they allow you to download this as an Excel spreadsheet.
00:04:06 --> 00:04:13 So if you want to do your own analysis, sort, filter, whatever, do counts on
00:04:13 --> 00:04:17 keywords, anything you want to do.
00:04:17 --> 00:04:24 What I love about this is it's an academic view of something that is what we in
00:04:24 --> 00:04:26 the nerd world would call open source.
00:04:26 --> 00:04:32 So here you have this open database, and you can decide for yourself
00:04:32 --> 00:04:36 whether or not it's useful. Now, who is it useful to?
00:04:36 --> 00:04:41 Well, it is 10% of this audience and 0% of your clients.
00:04:41 --> 00:04:43 Oh, I disagree completely.
00:04:43 --> 00:04:46 Okay, this is good when we disagree.
00:04:46 --> 00:04:53 We're in the golden age of NIST and academic frameworks that I feel
00:04:53 --> 00:04:56 are incredibly useful in business.
00:04:56 --> 00:05:02 Again, I'm going to fall back on my basic concept of the true value of IT services
00:05:02 --> 00:05:05 organizations is in their expertise in guiding
00:05:05 --> 00:05:07 customers to the correct technologies.
00:05:07 --> 00:05:14 You have more than any time ever, academics and industry professionals who
00:05:14 --> 00:05:19 spend their entire lives thinking about this in a
00:05:19 --> 00:05:24 conceptual sense and give you all of the resources to be incredibly smart
00:05:24 --> 00:05:27 very quickly on a lot of topics.
00:05:27 --> 00:05:32 I look at a database like this and say, this is exactly the useful framework
00:05:32 --> 00:05:36 that I can then take to customers and apply when I need to consider
00:05:36 --> 00:05:38 and answer their questions.
00:05:38 --> 00:05:41 I don't have to talk in theoretical terms about what AI risk is.
00:05:41 --> 00:05:46 I have exact models and use cases to compare against to understand where the
00:05:46 --> 00:05:51 smartest people around who just spend their time thinking about this have given
00:05:51 --> 00:05:52 me all of the resources needed.
00:05:52 --> 00:06:01 You can get up to speed on practices at an incredibly rapid rate and leverage
00:06:01 --> 00:06:04 incredible intelligence quickly with your customers.
00:06:04 --> 00:06:09 What I want people to think is, I'm not saying I think everyone needs to go
00:06:09 --> 00:06:12 and learn all 700 and some scenarios.
00:06:12 --> 00:06:17 It's more that you need to be really good
00:06:17 --> 00:06:23 at matching information to your customers, and you're given a searchable, indexable,
00:06:23 --> 00:06:27 sortable resource that you can use very quickly.
00:06:27 --> 00:06:31 I look at this and say, your job is knowing all these resources.
00:06:31 --> 00:06:35 These resources are available to you, and leveraging them makes you exceptional.
00:06:35 --> 00:06:38 To your point, Carl, if only 10% will use it?
00:06:38 --> 00:06:40 Yeah, those are the best 10%.
00:06:42 --> 00:06:48 Well, and those are the 10% who are going to get paid for their professional
00:06:48 --> 00:06:50 services and consulting advice.
00:06:50 --> 00:06:56 I have three comments about this that I think it is very interesting.
00:06:56 --> 00:07:01 Number one, thank you very much, MIT, for quantifying what has up until now been
00:07:01 --> 00:07:04 just a general sense of foreboding doom.
00:07:04 --> 00:07:08 Everybody has said, AI equals risk. What risk?
00:07:08 --> 00:07:12 Well, one, and two, and three, and holy cow, you can't imagine how many.
00:07:12 --> 00:07:16 Yes, we can, because we're scientists. Because in a professional world, you can
00:07:16 --> 00:07:23 say there are not just a lot, there are a number, and that's very good.
00:07:23 --> 00:07:27 And by the way, there are at least 700 risks with AI.
00:07:27 --> 00:07:31 We all know that intuitively, but we need a number to work against.
00:07:31 --> 00:07:36 Number two, exactly to your point, Carl, very few customers are going to go into
00:07:36 --> 00:07:40 that and try to solve for these problems because it is the paralysis
00:07:40 --> 00:07:42 by analysis paradox.
00:07:43 --> 00:07:45 It's literally, I can be responsible.
00:07:45 --> 00:07:49 Holy cow, there's too many things to be responsible about.
00:07:49 --> 00:07:51 I'm just going to do what I was going to do anyway, and we're just going to
00:07:51 --> 00:07:53 hope that everything works out okay.
00:07:53 --> 00:07:58 Customers are not going to go in there and use this information.
00:07:58 --> 00:08:02 But point number three, I think, and if anybody would like to know how to do this,
00:08:02 --> 00:08:04 we would be happy to do this for you.
00:08:04 --> 00:08:12 You can package a service offering by industry, by customer size that says an
00:08:12 --> 00:08:16 AI risk assessment for your organization.
00:08:16 --> 00:08:21 I will examine the following X number of categories and specific use cases.
00:08:21 --> 00:08:27 I will identify actual behavior within your organization, a profile of what your
00:08:27 --> 00:08:32 business processes are, what is actually going on in the real world, too, because I
00:08:32 --> 00:08:35 have network tools that allow me to identify whether these tools are
00:08:35 --> 00:08:37 being used in your environment.
00:08:37 --> 00:08:41 And I can give you an action oriented report that says, here are your
00:08:41 --> 00:08:44 vulnerabilities, here are the action item recommendations, and here's exactly
00:08:44 --> 00:08:46 what you should do about that stuff.
00:08:46 --> 00:08:51 Oh, and by the way, if it's in the SMB world, that costs $5K to $10K for that
00:08:51 --> 00:08:55 level of work, and it's going to be highly templatized.
00:08:55 --> 00:09:00 If I'm in the mid-market, I'm talking $25K to $35K, and it's
00:09:00 --> 00:09:01 going to require some interviews.
00:09:01 --> 00:09:07 If you happen to want to get involved with some divisions of larger enterprises in
00:09:07 --> 00:09:11 your local neighborhood, this is a great way to get your foot in the door and get a
00:09:11 --> 00:09:19 50 to 100K services gig that is, A, very timely, and B, once you've done it a few
00:09:19 --> 00:09:23 times and you've figured out what the templates are, it is oh, so infinitely
00:09:23 --> 00:09:28 repeatable that your margins shouldn't be 50% on that thing.
00:09:28 --> 00:09:31 They should be 75% on that thing.
00:09:31 --> 00:09:32 This is great.
00:09:32 --> 00:09:36 That, to me, makes the point that only 10% of our audience
00:09:36 --> 00:09:41 is actually interested in going to all that work or doing that deep a dive.
00:09:41 --> 00:09:48 I do think this is a rich, rich area to create a spectacular presentation on
00:09:48 --> 00:09:54 the dangers of AI that you can give in 20 minutes, not a super crazy deep dive, but
00:09:54 --> 00:09:58 just literally browse through this thing and say, Well, what are
00:09:58 --> 00:09:59 the concerns about privacy?
00:09:59 --> 00:10:01 What are the concerns about misinformation?
00:10:01 --> 00:10:03 What are the concerns about bad actors?
00:10:03 --> 00:10:09 And get a few things where you can point to the references that are outlined
00:10:09 --> 00:10:12 because this is all referencing articles, referencing articles,
00:10:12 --> 00:10:14 referencing articles.
00:10:14 --> 00:10:19 You could repeat that to the Kiwanis, the Rotary, the BNI, the Morning
00:10:19 --> 00:10:21 Chambers of Commerce.
00:10:21 --> 00:10:25 You can give this presentation every day for the next month, and you
00:10:25 --> 00:10:26 might get some clients out of it.
00:10:26 --> 00:10:27 Most people are not going to do it.
00:10:27 --> 00:10:28 Yeah, but we can say that about everything.
00:10:28 --> 00:10:32 Most people will not execute.
00:10:32 --> 00:10:35 And that's why, in a certain degree, I don't mind talking about it
00:10:35 --> 00:10:36 because it's the...
00:10:36 --> 00:10:38 Look, I know a lot of people aren't going to do it.
00:10:38 --> 00:10:40 You don't want to be that person.
00:10:40 --> 00:10:42 You want to be the person that actually executes.
00:10:42 --> 00:10:46 And that's where the value is.
00:10:46 --> 00:10:50 Well, and to your point, Carl, if you're not going to package it the way I
00:10:50 --> 00:10:55 described and sell it as a professional service, please use it as a promotional
00:10:55 --> 00:11:00 vehicle to get into a conversation that then says, Oh, and by the way, we
00:11:00 --> 00:11:02 could monitor your network for you.
00:11:02 --> 00:11:04 You should sign a managed services contract.
00:11:04 --> 00:11:11 That, to me, is a very topical reality that almost everybody can use.
00:11:11 --> 00:11:14 But we're done giving you advice on that topic.
00:11:14 --> 00:11:18 Let's move on to our next one, guys.
00:11:18 --> 00:11:21 Ripped from the headlines of the most obvious news that any of us have
00:11:21 --> 00:11:25 encountered recently, Google has a monopoly.
00:11:25 --> 00:11:26 What?
00:11:26 --> 00:11:28 Say it isn't so.
00:11:28 --> 00:11:32 And not just a monopoly, but a monopoly in search.
00:11:32 --> 00:11:38 So the news, if you have not been keeping up with this, this is not immediate.
00:11:38 --> 00:11:41 This is not just a short term assessment where somebody woke up one day
00:11:41 --> 00:11:43 and decided to point fingers.
00:11:43 --> 00:11:49 This is years of analysis and legal exploration going into a scenario to
00:11:49 --> 00:11:54 examine the business behavior, not the prevalence of technology,
00:11:54 --> 00:11:59 but the business practices around the application, requiring it to be positioned
00:11:59 --> 00:12:06 as default, requiring compatibilities and embedded capabilities with their search
00:12:06 --> 00:12:09 engine in other people's software.
00:12:09 --> 00:12:11 It's not the technology.
00:12:11 --> 00:12:15 It is the business behavior around the technology that
00:12:15 --> 00:12:18 apparently has risen to a level.
00:12:18 --> 00:12:23 Now, we've been talking about some things that we believe are obvious
00:12:23 --> 00:12:28 monopolies in many dimensions of our industries for a number of years.
00:12:28 --> 00:12:33 This is the first time anybody with any authority actually said something about it
00:12:33 --> 00:12:35 since Microsoft.
00:12:35 --> 00:12:37 That's a million years ago.
00:12:37 --> 00:12:41 What do you guys think is actually going to come from this?
00:12:41 --> 00:12:45 And do you believe this is going to change anything about their behavior?
00:12:45 --> 00:12:48 I want to make predictions. I'm totally on board for this.
00:12:48 --> 00:12:53 So I'm going to say I don't think Google will get broken up.
00:12:53 --> 00:12:58 I think that is one of those moves that the government doesn't generally like to
00:12:59 --> 00:13:03 do if they don't have I'm not sure it's necessarily as clean.
00:13:03 --> 00:13:07 I could see a very clear argument for you could split main Google
00:13:07 --> 00:13:09 search with YouTube.
00:13:09 --> 00:13:13 It makes sense to me because you end up with a world of two search engines.
00:13:13 --> 00:13:14 But I don't actually...
00:13:14 --> 00:13:18 Because the first immediate move would be for them to make themselves the search for the other thing.
00:13:18 --> 00:13:22 It makes a lot of sense if you want to create a search engine bit.
00:13:22 --> 00:13:25 But I actually think they're going to be a little bit more cautious.
00:13:25 --> 00:13:30 I think the main thing I'm going to see out of this is a ban on
00:13:30 --> 00:13:36 these contracts where they're buying off someone else to not get in there.
00:13:36 --> 00:13:38 The obvious one is Apple.
00:13:38 --> 00:13:43 Google just writes a check to Apple for placement as
00:13:43 --> 00:13:45 a search engine, but it also disincentivizes Apple
00:13:45 --> 00:13:46 from ever looking at that.
00:13:47 --> 00:13:51 I think they're going to clearly say, When you are at some measurable level of the
00:13:51 --> 00:13:55 dominant player, you cannot buy your way to hold on that.
00:13:56 --> 00:13:57 You've got to compete.
00:13:57 --> 00:14:01 I think that's going to be the obvious remedy that we'll see out of this.
00:14:01 --> 00:14:05 Whether or not that will inspire players like Apple to get into search
00:14:06 --> 00:14:09 will be an interesting play.
00:14:09 --> 00:14:16 I think the one thing that I'm looking for is a remedy that actually does create
00:14:16 --> 00:14:19 the opportunity to create new markets.
00:14:19 --> 00:14:24 And I'll be interested in, and by the way, just slowing Google down might do that.
00:14:24 --> 00:14:28 I think there's an argument that their search results have been
00:14:28 --> 00:14:33 garbage for a while and that they are not delivering on the best product.
00:14:33 --> 00:14:39 If they're distracted by having to handle a bunch of legal stuff, it might open the
00:14:39 --> 00:14:42 opportunity for somebody else to do something there.
00:14:42 --> 00:14:45 But I think the obvious bit is we're going to say, No, you can't buy your
00:14:45 --> 00:14:47 way to the top of the market.
00:14:47 --> 00:14:48 A couple of things.
00:14:48 --> 00:14:56 First of all, thank goodness, the EU has done a better job of being effective
00:14:56 --> 00:15:02 with managing the big tech giants than the United States.
00:15:02 --> 00:15:06 I think that basically saying, No, you can't do that,
00:15:06 --> 00:15:09 has already set Google on the path to figuring out how they're
00:15:09 --> 00:15:10 going to handle this.
00:15:10 --> 00:15:14 I don't know what their answer is, but I know that Google knows
00:15:14 --> 00:15:15 what their answer is.
00:15:15 --> 00:15:17 I do think it's interesting.
00:15:17 --> 00:15:20 It's not illegal to have a monopoly.
00:15:20 --> 00:15:25 It is illegal to behave like a monopoly.
00:15:25 --> 00:15:27 It's like, Okay, here's a question.
00:15:27 --> 00:15:31 If you could choose any search engine in the world, which
00:15:31 --> 00:15:34 you can, would you choose Google?
00:15:34 --> 00:15:39 Today and day after day after day, I do.
00:15:39 --> 00:15:43 Not that it is the greatest I could possibly imagine.
00:15:43 --> 00:15:44 It's like the government.
00:15:44 --> 00:15:47 It's not the best democracy that can be made.
00:15:47 --> 00:15:50 It's the best that we will accept.
00:15:50 --> 00:15:54 I would like Google to be better in many ways.
00:15:54 --> 00:16:00 I think the interesting part is that Microsoft is probably a good example,
00:16:00 --> 00:16:04 in that what they really did is they looked at little things like,
00:16:04 --> 00:16:08 Well, you can't make a deal with IBM so that IBM has to pay you for an
00:16:08 --> 00:16:11 operating system, whether they install it or not.
00:16:11 --> 00:16:13 Like, holy crap.
00:16:13 --> 00:16:15 That's a genius move if you're selling.
00:16:15 --> 00:16:19 It's not a great move if you're buying.
00:16:19 --> 00:16:23 And maybe they will figure out how to open it up and have some real competition.
00:16:23 --> 00:16:27 Apple would be smart to never get into the search business.
00:16:27 --> 00:16:32 So they're always going to buy from a partner. It's just a matter of, Okay, can somebody
00:16:32 --> 00:16:36 else offer them more money than Google or a better deal or something that fits
00:16:36 --> 00:16:42 better with their future AI offering or fits better with their ecosystem?
00:16:42 --> 00:16:48 What's the searching that works best for them, separate from everybody else?
00:16:48 --> 00:16:52 I would love to think that there's a future where we each have
00:16:52 --> 00:16:54 these bespoke search engines.
00:16:54 --> 00:16:58 I have what works for my business, you have what works for your business.
00:16:58 --> 00:17:02 There are lots of specialty search engines, but they're really hard to
00:17:02 --> 00:17:04 use unless you're in that specialty.
00:17:04 --> 00:17:09 Maybe we're going to force some innovation to other people as well.
00:17:09 --> 00:17:11 Before I let Ryan in, I'm going to just make a quick comment.
00:17:11 --> 00:17:13 You asked a really good question like, would you choose something else?
00:17:13 --> 00:17:17 I'm finding that for at least 50% of my searches, I'm not choosing Google.
00:17:17 --> 00:17:18 I'm, in fact, choosing ChatGPT.
00:17:19 --> 00:17:23 The reason is I have a focus question.
00:17:23 --> 00:17:28 I generally know that it will be in the vast, broad, settled amount of knowledge.
00:17:28 --> 00:17:32 It's not something that is going to be variable or timely.
00:17:32 --> 00:17:34 And by the way, the fact that it gives me a couple of sources means I
00:17:34 --> 00:17:35 can double-check its work.
00:17:35 --> 00:17:39 It's an established thing that has been known for 20 years.
00:17:39 --> 00:17:40 That's a great way of getting it.
00:17:40 --> 00:17:43 For 50% of my searches, I'm not choosing it.
00:17:43 --> 00:17:48 Well, see, and this is where if you go back to the fundamentals of
00:17:48 --> 00:17:52 why is monopoly behavior frowned upon?
00:17:52 --> 00:17:54 Why do we actually care about that?
00:17:54 --> 00:17:59 Well, it's because you use market position and money to substitute
00:17:59 --> 00:18:02 for competition and innovation.
00:18:02 --> 00:18:08 If you have the very best product and you achieve a dominant market position because
00:18:08 --> 00:18:12 it is bigger, better, faster than anything else that's out there,
00:18:12 --> 00:18:17 congratulations to you and congratulations to me as the user because you've made my
00:18:17 --> 00:18:22 life better with a product that is demonstrably better than anything else.
00:18:22 --> 00:18:26 When you get to that position, and to your point, Carl, you begin to
00:18:26 --> 00:18:29 behave in a monopolistic way.
00:18:29 --> 00:18:33 What happens is you buy your market dominance instead of earn your market
00:18:33 --> 00:18:38 dominance, meaning you take your foot off the gas pedal of innovation and you cease
00:18:38 --> 00:18:43 to ship the best technology, which punishes not only the buyers, but
00:18:43 --> 00:18:45 also the actual customers, the users.
00:18:45 --> 00:18:48 That's exactly where Dave's example is going.
00:18:48 --> 00:18:58 The capabilities of Gen AI added to search make the potential for that service, for
00:18:58 --> 00:19:02 that basic technology, radically better than anything we have been
00:19:02 --> 00:19:05 accustomed to in the last 20 years.
00:19:05 --> 00:19:08 Google has not done well with that.
00:19:08 --> 00:19:11 They have stumbled, in a lot of the research that I'm reading and in
00:19:11 --> 00:19:13 my own personal experience.
00:19:13 --> 00:19:16 There are many times I will ask a question in Google, and everything
00:19:16 --> 00:19:18 I get back is an ad.
00:19:18 --> 00:19:21 Literally everything I get back is an ad. Not cool.
00:19:21 --> 00:19:23 That's not what I'm looking for.
00:19:23 --> 00:19:28 ChatGPT has some reliability and hallucination problems, but it
00:19:28 --> 00:19:30 does give me a more robust answer.
00:19:30 --> 00:19:36 But Google still is the A number one by a long way in the marketplace because they
00:19:36 --> 00:19:41 forced it through monopolistic behavior rather than earned it through innovation.
00:19:41 --> 00:19:47 If this has that simple impact on the search market, it solves a problem.
00:19:47 --> 00:19:48 I'm all for it.
00:19:48 --> 00:19:53 If it sends a warning sign to others in other marketplaces to
00:19:53 --> 00:19:57 stop it with the buying dominance and go back to a world where you actually
00:19:57 --> 00:20:01 innovate and earn, that would be a pipe dream.
00:20:01 --> 00:20:03 I don't think we're going to get there yet.
00:20:03 --> 00:20:06 It'll take some more raps on the knuckles before anybody else
00:20:06 --> 00:20:08 actually changes behavior.
00:20:08 --> 00:20:10 But this is a good start.
00:20:10 --> 00:20:13 I would say, just for a note, Google has made a pretty good case.
00:20:13 --> 00:20:19 They do continue to innovate, and my number two search engine is YouTube.
00:20:20 --> 00:20:21 Maybe they will get programmed.
00:20:22 --> 00:20:25 But I'm going to move us on to the last topic, and I want to take an
00:20:25 --> 00:20:26 interesting angle on this.
00:20:26 --> 00:20:32 NIST just released their guidance for cryptography for quantum computing.
00:20:32 --> 00:20:36 I will say, look, it's an interesting framework for those that
00:20:36 --> 00:20:38 are in the encryption space.
00:20:38 --> 00:20:40 Get to work, guys, because you've got new data.
00:20:40 --> 00:20:45 But what I wanted to do is I actually rejected this story for the business of
00:20:45 --> 00:20:49 tech, and I wanted to bring it to you guys instead because I looked at this and said,
00:20:49 --> 00:20:52 okay, there's the obvious element of upgrade your cryptography
00:20:52 --> 00:20:54 when there's a better version.
00:20:54 --> 00:20:56 That just always makes sense.
00:20:56 --> 00:21:02 But I feel like quantum computing as a thing has always been, well, we're within
00:21:02 --> 00:21:07 10 years of it, every three years, as far as I can remember now at this point.
00:21:07 --> 00:21:12 I mean, it just feels like we're so close to quantum that I've come to the point
00:21:12 --> 00:21:15 where I think I just don't believe them.
00:21:15 --> 00:21:20 I think I just don't think this is a thing.
00:21:20 --> 00:21:26 I will open a space for, sure, in the 27th century, when the next version of Star
00:21:26 --> 00:21:29 Trek is out, we may have quantum computing.
00:21:29 --> 00:21:33 But I don't think in any practical terms, this is a thing.
00:21:33 --> 00:21:36 Am I missing the boat? I want to get a gut check here.
00:21:36 --> 00:21:37 What's your take on quantum?
00:21:38 --> 00:21:43 So much technology is like this, and we've seen this just recently in the
00:21:43 --> 00:21:45 pandemic where we're saying, Oh, how come robots aren't taking over the world?
00:21:45 --> 00:21:51 Well, because in the real world, people are saying, Well, I'll work for a
00:21:51 --> 00:21:53 dollar less if you don't take my job away.
00:21:53 --> 00:21:57 And so it's delayed the actual use of that technology.
00:21:57 --> 00:22:03 I think things like the fact that we have GPUs and we have
00:22:03 --> 00:22:09 these processors that can just put more horsepower on a problem, we don't
00:22:09 --> 00:22:12 have a need for quantum computing.
00:22:12 --> 00:22:16 It's not that it's not real or it's not there or it's not going to happen.
00:22:16 --> 00:22:22 If it doesn't happen, it'll be because we don't need it, because we can
00:22:22 --> 00:22:24 continually increase horsepower.
00:22:24 --> 00:22:28 Even if we don't necessarily increase the horsepower with one chip,
00:22:28 --> 00:22:34 we have more and more processors, and we can now buy NVIDIA chips
00:22:34 --> 00:22:36 by the gallon, right?
00:22:36 --> 00:22:41 We'll just buy more horsepower, and then they get smaller and smaller and smaller.
00:22:42 --> 00:22:45 See, and I will go to the next order of impact.
00:22:45 --> 00:22:50 Because what Carl is saying is, if I have an alternative technology that I
00:22:50 --> 00:22:56 can use to solve the future problem, it's existing, it's tested, it's understood.
00:22:56 --> 00:22:57 Let's just do what's familiar.
00:22:57 --> 00:23:03 I'll go to the next level and say that I think the thing that will prevent rapid
00:23:03 --> 00:23:07 deployment of quantum is the environmental impact.
00:23:07 --> 00:23:11 And I don't mean just on trees and waterways and cute
00:23:11 --> 00:23:11 little bunnies in nature.
00:23:11 --> 00:23:17 What I mean is the literal environment in which these systems play.
00:23:17 --> 00:23:22 The power consumption requirements for quantum computing are projections because
00:23:23 --> 00:23:24 nobody really has one of these things yet.
00:23:24 --> 00:23:29 If you're going to use one of these that does what it is scientifically suggested
00:23:29 --> 00:23:35 it could do, it's going to consume power at a rate beyond even what these data
00:23:35 --> 00:23:39 centers are doing for AI, which we know is an order of magnitude beyond
00:23:39 --> 00:23:42 what regular data centers did.
00:23:42 --> 00:23:47 We already covered, years ago, if you guys remember, a story about
00:23:47 --> 00:23:53 the health impacts for people living in proximity of large format data centers.
00:23:53 --> 00:23:57 The hum that happens 24/7 leads to hearing problems.
00:23:57 --> 00:24:00 It creates sleeping problems, et cetera.
00:24:00 --> 00:24:06 Order of magnitude to AI, further order of magnitude up to the consumption of
00:24:06 --> 00:24:08 what quantum computing is going to be.
00:24:08 --> 00:24:11 I don't think we can innovate out of that.
00:24:11 --> 00:24:17 I think that the world will look around and go, I'm not willing to pay you the
00:24:17 --> 00:24:22 extra second order costs in order to consume this technology.
00:24:22 --> 00:24:27 I will buy a quantum computer for X, but then in order to cool
00:24:27 --> 00:24:32 it, to power it, in order for it to live in a place where it doesn't poison the
00:24:32 --> 00:24:36 local citizenry, it's going to cost me X times four.
00:24:36 --> 00:24:38 I'm not willing to pay the X times four.
00:24:38 --> 00:24:40 So thank you very much.
00:24:40 --> 00:24:42 I'll just stick with what I'm doing right now.
00:24:42 --> 00:24:46 I think the capability, the potential, the stories we've been told
00:24:46 --> 00:24:53 about quantum, yes, they are fascinating, but they are not yet economically viable.
00:24:53 --> 00:24:55 And that's an engineering problem.
00:24:55 --> 00:24:56 That's not a scientific problem.
00:24:56 --> 00:24:58 That's an engineering problem.
00:24:58 --> 00:25:03 You need to figure out a way to deploy power without consumption.
00:25:03 --> 00:25:08 That's a single variable that you need to control for in designing and
00:25:08 --> 00:25:09 deploying these kinds of systems.
00:25:09 --> 00:25:11 By the way, one last comment.
00:25:11 --> 00:25:18 If quantum does happen, all your cybersecurity is grandma stuff.
00:25:18 --> 00:25:19 It is outdated.
00:25:19 --> 00:25:24 It is the olden times, and it does not work in a quantum world.
00:25:24 --> 00:25:30 We do not presently possess cryptography that can withstand
00:25:30 --> 00:25:34 quantum attacks for longer than just a couple of minutes.
00:25:34 --> 00:25:37 We're literally not using that technology today.
00:25:37 --> 00:25:41 So if and when quantum comes, I hope you all are thinking about the
00:25:41 --> 00:25:44 cyber impacts because it's going to break all your tools.
00:25:44 --> 00:25:47 Hold on. I want to break that sentence down.
00:25:47 --> 00:25:49 First off, "when" is the question.
00:25:49 --> 00:25:52 And I'm coming to the conclusion that it isn't.
00:25:52 --> 00:25:57 And "if" is doing a whole lot of heavy lifting right there, Ryan, because
00:25:57 --> 00:26:01 if "when" is never, "if" doesn't matter.
00:26:01 --> 00:26:04 That's your sentence diagramming by the way.
00:26:04 --> 00:26:08 I'm just breaking it down to the basics here and observing it.
00:26:08 --> 00:26:09 Okay, sure.
00:26:09 --> 00:26:13 I hear you on all of this stuff, but I'm coming to the conclusion
00:26:13 --> 00:26:16 that this isn't a thing.
00:26:16 --> 00:26:24 In fact, head cycles spent on it at any level are just not worth my time.
00:26:24 --> 00:26:29 Now, I will give a space of, Look, I think making things
00:26:29 --> 00:26:32 generally more secure, and I'm putting that in big old air
00:26:32 --> 00:26:33 quotes, is a good thing.
00:26:33 --> 00:26:38 My statement is, Okay, if you've designed a new set of encryption that is
00:26:38 --> 00:26:43 invulnerable to this theoretical thing that is way more advanced than our regular
00:26:43 --> 00:26:47 stuff, well, then it should also be good enough for our regular stuff, and I
00:26:47 --> 00:26:50 don't see any reason to not do it.
00:26:50 --> 00:26:55 But if somebody says, Dave, I want you to spend time and brain cycles on this thing
00:26:55 --> 00:26:58 that only benefits quantum computing, I think my answer is,
00:26:58 --> 00:27:03 That's a waste of my time, and I'm not doing any work on it because
00:27:03 --> 00:27:06 you haven't even proven you can make the thing.
00:27:06 --> 00:27:10 Like, literally, the basics of making it have not even been proven.
00:27:10 --> 00:27:15 So to Ryan, I would say, literally, all of your objections, old man,
00:27:15 --> 00:27:19 can be overcome and will be overcome by time and technology.
00:27:19 --> 00:27:22 Everything starts out being too expensive.
00:27:22 --> 00:27:23 Everything's impossible.
00:27:23 --> 00:27:24 Everything costs too much.
00:27:24 --> 00:27:31 If you think of how many BTUs it took to light a house 100 years ago, well, you
00:27:31 --> 00:27:33 had to get the kerosene, and you had to get the kerosene to the
00:27:33 --> 00:27:35 house and, like, holy smokes.
00:27:35 --> 00:27:39 Now you have LED lights that take essentially nothing except
00:27:39 --> 00:27:44 static electricity. So the future will take care of itself.
00:27:44 --> 00:27:51 Today, I would say, I have been hearing about AI since before I was born.
00:27:51 --> 00:27:56 Like, literally, you watch old black and white science fiction movies and TV shows.
00:27:56 --> 00:28:00 They've been talking about AI for 60 years.
00:28:00 --> 00:28:03 And then one day, it became reality.
00:28:03 --> 00:28:08 Everybody, five years ago, you could have made the argument, They've been talking
00:28:08 --> 00:28:10 about AI for 50 years, and it's never going to happen.
00:28:10 --> 00:28:17 I think it's going to be the exact same thing with quantum computing.
00:28:18 --> 00:28:22 Part of it is, with AI, we still haven't figured
00:28:22 --> 00:28:23 out the killer app.
00:28:23 --> 00:28:30 We haven't figured out a thing that's the email equivalent of the web.
00:28:30 --> 00:28:33 What is it that we would do with quantum computing?
00:28:33 --> 00:28:35 What's the killer app of quantum computing?
00:28:35 --> 00:28:41 Well, if you're not a quantum physicist or you're not a mathematician,
00:28:41 --> 00:28:43 I'm not sure there's a use for it outside of your paycheck.
00:28:44 --> 00:28:45 Well, and that's the thing, right?
00:28:45 --> 00:28:46 There are use cases.
00:28:46 --> 00:28:50 They're just not economically viable yet, right?
00:28:50 --> 00:28:51 I agree with you, Carl.
00:28:51 --> 00:28:57 The ecosystem of technology will solve for the problems that I've been outlining.
00:28:57 --> 00:28:59 That sounds like opportunity, and it sounds like business
00:28:59 --> 00:29:01 development to me, right?
00:29:01 --> 00:29:03 From an industry perspective.
00:29:03 --> 00:29:08 My thing is, I do believe that the future takes forever
00:29:08 --> 00:29:10 until it happens all at once.
00:29:10 --> 00:29:16 Exactly as Carl is describing, it will come to Dave's point, not this year.
00:29:16 --> 00:29:19 If you're focused on this year, please don't spend any time building a
00:29:19 --> 00:29:21 quantum practice, but it will come.
00:29:21 --> 00:29:23 We just need to be ready for it.
00:29:23 --> 00:29:27 Well, I want to push back slightly there, Carl, because while you're right on the AI
00:29:27 --> 00:29:32 stuff, we have seen advancements along the way
00:29:32 --> 00:29:34 for a good portion of that.
00:29:34 --> 00:29:37 We could go back in time, let's say 2015.
00:29:37 --> 00:29:42 I worked on advanced projects, data science stuff, using machine learning, that
00:29:42 --> 00:29:45 it was a stretch to call AI.
00:29:45 --> 00:29:46 But it had the glimmers of that.
00:29:46 --> 00:29:50 You could go back further than that and you could see versions where we were
00:29:50 --> 00:29:54 building the systems that pointed toward that.
00:29:54 --> 00:29:58 I have this element of I saw a trend line over time of advancements,
00:29:58 --> 00:29:59 and you're right.
00:29:59 --> 00:30:01 All of a sudden, we see the breakthrough and it crashes through.
00:30:01 --> 00:30:06 What I'm pushing back on is, they don't even have the basic bits over on quantum.
00:30:06 --> 00:30:10 It's all whiteboard stuff where nothing actually works.
00:30:10 --> 00:30:15 There's a difference between seeing elements of it and You guys are having
00:30:15 --> 00:30:17 fun on a whiteboard, and that's all cool.
00:30:17 --> 00:30:22 But until you can actually build any of it, stop wasting my time.
00:30:22 --> 00:30:27 Maybe AI will be the tool that productises quantum.
00:30:27 --> 00:30:30 Perhaps. And then it replaces itself with quantum.
00:30:30 --> 00:30:35 Well, when the word "theoretical" is built into
00:30:35 --> 00:30:38 your definition, maybe you'll never show up in the real world.
00:30:38 --> 00:30:40 Maybe you'll never show up.
00:30:40 --> 00:30:49 And with that happy note, we bring an end to episode 209 of the Killing It...
00:30:49 --> 00:30:51 Podcast.
00:30:51 --> 00:30:54 Thanks for tuning in to the Killing It podcast.
00:30:54 --> 00:30:59 Please share with your friends and tell everyone to subscribe on iTunes,
00:30:59 --> 00:31:03 Stitcher, all the podcast places.
00:31:03 --> 00:31:07 Join us next week and help us keep killing it in the technology business.

