July 14, 2021 xerxes

About

Kai Gondlach

Mission-driven and future-enthusiastic describe him best; Kai's scientifically backed futures scenarios are contagious. After taking his first entrepreneurial steps as a web designer in the mid-2000s, Kai studied sociology and political science and later earned his master's degree in futures research. After working for and with hundreds of companies and public institutions, he is still convinced that change starts with the inspirational spark in the eyes of his audiences and workshop attendees.

He's both an academic in the field of futures science & futures literacy and a professional keynote speaker, including a TEDx talk.

Links

Website: www.kaigondlach.de / www.futurologist.de

Linkedin: https://www.linkedin.com/in/zukunftsforscher-gondlach/

Podcast (German): “Im Hier und Morgen“ on all platforms or visit https://www.kaigondlach.de/category/podcast/

Transcript of the Interview

This text has been auto-transcribed. Please excuse mistakes.

1
00:00:02,580 –> 00:00:05,970
Welcome to Challenging Paradigm X.

2
00:00:06,390 –> 00:00:11,820
What is futurology, and is the study of futures scientific at all?

3
00:00:11,850 –> 00:00:15,057
Is being agile

4
00:00:15,087 –> 00:00:17,517
the opposite of being resilient?

5
00:00:18,884 –> 00:00:25,958
What are the biggest challenges when it comes to
initiating transformation in organizations?

6
00:00:26,740 –> 00:00:35,080
Can we just continue exploiting our planet, as artificial
intelligence will eventually become our savior?

7
00:00:35,917 –> 00:00:40,657
And what opportunities does the
Corona pandemic create for us?

8
00:00:41,613 –> 00:00:43,983
My guest today is Kai Gondlach.

9
00:00:44,913 –> 00:00:50,073
He is a mission-driven and
future-enthusiastic futurologist.

10
00:00:50,403 –> 00:00:56,703
That means he is a futures scientist,
in his case with a background in sociology,

11
00:00:56,783 –> 00:01:00,773
political sciences, and futures research.

12
00:01:01,182 –> 00:01:10,272
He has experience as an entrepreneur, consults organizations,
and still does academic work in the field of futures.

13
00:01:11,020 –> 00:01:17,950
Apart from that, he works as a professional keynote
speaker, including a TEDx talk, and has his own podcast.

14
00:01:18,547 –> 00:01:25,857
So if you're interested in future topics
with a positive outlook, stay tuned.

15
00:01:28,520 –> 00:01:29,950
Today, I’m here with Kai.

16
00:01:30,790 –> 00:01:31,720
Good to have you here.

17
00:01:32,200 –> 00:01:32,770
Thanks for having me.

18
00:01:33,460 –> 00:01:34,900
Please introduce yourself.

19
00:01:34,930 –> 00:01:36,310
Where are you and what do you do?

20
00:01:37,080 –> 00:01:37,620
Who are you?

21
00:01:37,620 –> 00:01:38,190
Who am I?

22
00:01:38,250 –> 00:01:40,110
One of the biggest questions in the universe, I think.

23
00:01:40,110 –> 00:01:40,290
Right?

24
00:01:40,650 –> 00:01:41,400
Why am I here?

25
00:01:42,730 –> 00:01:47,770
I was born and raised in the very north of
Germany, but actually my parents didn't come from there.

26
00:01:47,770 –> 00:01:54,340
But I think what describes me best is that I'm a very
enthusiastic person when it comes to future questions.

27
00:01:54,730 –> 00:01:56,170
And that’s part of my job description as well.

28
00:01:56,170 –> 00:01:59,470
So I'm a futurologist, a futures scientist, if you will.

29
00:01:59,980 –> 00:02:05,980
And I like both the technological stuff, as
well as the societal stuff, the political stuff.

30
00:02:06,280 –> 00:02:10,810
And that's why I'm quite constantly engaged in researching

31
00:02:11,385 –> 00:02:16,125
topics, or reading studies, writing studies,
reading blog posts, doing my own podcast.

32
00:02:16,455 –> 00:02:16,905
Essentially,

33
00:02:16,905 –> 00:02:23,565
I think I like to think about futures and visions,
and I'd like to anticipate basically all of that.

34
00:02:23,565 –> 00:02:32,295
And why are there political parties, for example, that
try to get back to pre-modern times basically? But

35
00:02:32,325 –> 00:02:38,265
why are there also so many constraints in society
that try to undermine the visionary parts,

36
00:02:39,490 –> 00:02:44,080
which is what futurists or futurologists
like us basically do and talk about?

37
00:02:44,410 –> 00:02:49,810
So it's interesting to study humankind, basically,
and with a certain perspective on non-futurists.

38
00:02:49,870 –> 00:02:50,590
That’s what I do.

39
00:02:50,890 –> 00:02:51,280
Okay.

40
00:02:51,370 –> 00:02:52,870
And so why do you do what you do?

41
00:02:53,910 –> 00:02:55,950
I think because it’s fun, to be honest.

42
00:02:56,800 –> 00:03:03,750
I think when I started working, being an employee some 10
years ago, I was thrown, basically after my bachelor's

43
00:03:03,780 –> 00:03:08,380
thesis, into the labor market with employers and the job market.

44
00:03:08,380 –> 00:03:12,910
And you have to apply for many, many jobs to get
one piece of feedback or so, and then you get into the

45
00:03:12,910 –> 00:03:19,240
machinery, and then you somehow realize that
building your own career or whatever always

46
00:03:19,240 –> 00:03:24,250
involves outsmarting or out-competing other people.

47
00:03:24,610 –> 00:03:30,640
And of course, I mean, growing on your own. But
then I had one or two years in a very big company.

48
00:03:31,840 –> 00:03:33,370
That's a DAX 30 corporation.

49
00:03:34,000 –> 00:03:37,900
And I got to know all the ups and downs
of the people who were working there.

50
00:03:37,990 –> 00:03:44,350
Some had only been there for a couple of years, and some of them
had been there for 30 years and were already expecting their pensions.

51
00:03:44,710 –> 00:03:50,140
And what struck me the most, I
think, was that many of them didn't really love

52
00:03:51,220 –> 00:03:53,950
a lot of what they were doing. They were good at their jobs.

53
00:03:53,980 –> 00:03:54,970
No question at all.

54
00:03:55,120 –> 00:03:58,870
And some of them had a short
corporate career path behind them.

55
00:03:59,020 –> 00:04:03,160
Some of them were looking into the
future, like getting to the next level and so on.

56
00:04:03,940 –> 00:04:12,190
But I rarely encountered someone who was really
purposefully doing and loving what they were doing.

57
00:04:12,760 –> 00:04:12,970
Yeah.

58
00:04:13,000 –> 00:04:18,280
Then I found out that you could study
futures science, or futures research.

59
00:04:18,985 –> 00:04:24,805
And I thought that sounds so cool, because being
someone who had studied social sciences and political

60
00:04:24,805 –> 00:04:30,535
science before, I knew that there's more to be
done, more to be researched or focused

61
00:04:30,535 –> 00:04:33,235
on than just looking back into history.

62
00:04:33,265 –> 00:04:35,425
And I love history, I'm totally a history dude.

63
00:04:35,665 –> 00:04:40,885
But on the other hand, most of the studies
I encountered back then were always like

64
00:04:40,885 –> 00:04:42,505
describing what already had happened.

65
00:04:43,210 –> 00:04:48,610
long ago, and then not even deriving
from that what really needed to be done,

66
00:04:49,000 –> 00:04:53,350
like getting tips for politicians or for
the decision makers in the economy.

67
00:04:53,770 –> 00:04:54,550
And then I, yeah.

68
00:04:54,550 –> 00:04:58,960
Then I found out that you could study that in
a futures research program in Berlin, which I

69
00:04:58,960 –> 00:05:01,420
did, and I loved it because it was so

70
00:05:02,035 –> 00:05:03,325
interdisciplinary.

71
00:05:03,625 –> 00:05:11,185
So we learned so many things about so many industries
or disciplines, like for instance economic studies

72
00:05:11,185 –> 00:05:18,115
or social studies, then pedagogy studies, or,
I don't know, all kinds of studies, a lot of which

73
00:05:18,115 –> 00:05:20,575
involved sustainability studies or gender studies.

74
00:05:20,755 –> 00:05:23,895
So it was such a packed plan, in only two years.

75
00:05:24,135 –> 00:05:29,595
And after that we got the certificate, which basically said:
you are now the master of the future. But okay,

76
00:05:29,725 –> 00:05:30,825
not really, but a Master of

77
00:05:31,510 –> 00:05:32,380
Futures Science.

78
00:05:33,010 –> 00:05:40,060
And I loved that because there were so many new
possibilities or opportunities for me to realize what

79
00:05:40,060 –> 00:05:47,830
I loved, which was, or still is, not just researching
past events, but also thinking about what could happen

80
00:05:47,830 –> 00:05:54,820
next, and what the development could be like in
10 or 50 or a hundred years, and then backcasting

81
00:05:55,030 –> 00:05:58,600
what needs to be done to achieve a desirable one.

82
00:05:59,785 –> 00:06:06,535
Because I think that's one very important
aspect of futures studies that very

83
00:06:06,535 –> 00:06:13,135
few people, I think, know: it's not just about
researching very objectively what could happen.

84
00:06:13,135 –> 00:06:13,345
Yeah.

85
00:06:13,345 –> 00:06:19,645
That's one very important part, but equally
it's so important to anticipate what you

86
00:06:20,320 –> 00:06:22,090
would like to happen next.

87
00:06:22,450 –> 00:06:27,820
And we as a society, every group, every system, every
societal system, and it starts with my own mind,

88
00:06:28,120 –> 00:06:36,280
everything is a part or a subject of ambiguity and
complexity and so much change and volatility and all that.

89
00:06:36,520 –> 00:06:36,790
Of course.

90
00:06:37,750 –> 00:06:44,140
The fun fact, and also the challenging
fact, is that you barely ever meet someone, even an

91
00:06:44,140 –> 00:06:51,520
organization or a small group of people, be it a family
or so, who really has thought about their own futures.

92
00:06:52,300 –> 00:06:54,370
And I deliberately say futures, not just future.

93
00:06:55,060 –> 00:06:56,950
Every one of us thinks about the future all the time.

94
00:06:57,220 –> 00:07:02,920
As soon as we get out of bed in the morning, or we
leave the house to go to the grocery store

95
00:07:02,920 –> 00:07:08,500
or the market, or we meet up with friends in the evening,
which is finally possible again, we have always

96
00:07:08,500 –> 00:07:10,690
anticipated what could happen in the first place.

97
00:07:10,690 –> 00:07:14,770
We dress ourselves, we need to wear clothes.

98
00:07:15,010 –> 00:07:19,780
Sometimes we take an umbrella because we looked
at the weather forecast and it said there's

99
00:07:19,780 –> 00:07:22,300
a likelihood that it’s going to rain tonight.

100
00:07:22,300 –> 00:07:23,260
And so we are prepared.

101
00:07:24,640 –> 00:07:32,620
But the further you go into the future and the bigger
the systems are, obviously, but not really obviously,

102
00:07:32,620 –> 00:07:38,860
because we anticipate that far too little as a
society, then it gets complex and complicated.

103
00:07:39,190 –> 00:07:42,100
And of course there's not like the equivalent

104
00:07:43,030 –> 00:07:46,870
of the umbrella in the evening for a society in 10 years.

105
00:07:47,260 –> 00:07:56,470
So take, for example, the climate crisis.
I think 99.9% of scientists agree that we have a climate

106
00:07:56,470 –> 00:08:01,690
crisis, which has been strongly driven by humankind.

107
00:08:01,690 –> 00:08:03,220
Of course, just look at the Anthropocene.

108
00:08:04,675 –> 00:08:11,185
So we agree on that fact, but it's so difficult to
agree on which measures to take, to really

109
00:08:11,185 –> 00:08:16,105
find out what is happening right now in the Arctic with
the glaciers, for example, or with the rising sea levels

110
00:08:16,105 –> 00:08:19,675
or for environmental refugees and so on and so forth.

111
00:08:19,825 –> 00:08:26,515
It's really difficult to agree on an agenda
to prevent certain scenarios from taking place.

112
00:08:26,875 –> 00:08:32,005
And that's basically one of
the most political aspects of my job: to

113
00:08:33,340 –> 00:08:41,560
support those organizations, those institutions,
that are there to formulate, to create, or

114
00:08:41,560 –> 00:08:44,040
to decide on what actually needs to happen.

115
00:08:44,040 –> 00:08:49,170
And that's a very interesting job, because I have to,
or I get to, meet interesting people in institutions.

116
00:08:49,170 –> 00:08:56,000
I think we're in the same boat here. But on the other
hand, sometimes it's what I would call a very high

117
00:08:56,920 –> 00:09:03,640
or a strong responsibility, because the people who are
listening to us right now, or seeing us on the screen,

118
00:09:03,730 –> 00:09:05,890
also see us in entertaining mode.

119
00:09:06,070 –> 00:09:11,770
And of course it's part of the job to deliver
the scientific facts and visions of the futures.

120
00:09:11,770 –> 00:09:12,250
Of course.

121
00:09:12,520 –> 00:09:18,730
But on the other hand, since those are the people who
listen to us and take action after that, it's so important that the

122
00:09:18,730 –> 00:09:24,700
ethically right or correct stuff is being delivered.

123
00:09:26,155 –> 00:09:36,145
Please tell me: did you, during your studies or beforehand
or afterwards, have any epiphanies or turning points

124
00:09:36,175 –> 00:09:45,205
that led you to do what you do now, that
changed your perspective or your perception drastically?

125
00:09:46,645 –> 00:09:47,545
Yes, numerous.

126
00:09:47,545 –> 00:09:54,025
I think the first thing is the
constellation of my family members. It

127
00:09:54,025 –> 00:09:58,405
happens to be that my father is pretty old for
my age, or I am pretty young to be his son.

128
00:09:58,705 –> 00:10:03,925
Well, he was 53 years old when I was
born, which leads to the situation that I

129
00:10:04,450 –> 00:10:08,110
basically have a father who's the same
age as the grandfathers of my friends.

130
00:10:08,710 –> 00:10:12,730
So we have an intergenerational education
basically in the family, which is great.

131
00:10:12,940 –> 00:10:16,390
So he experienced the Second World War.

132
00:10:16,720 –> 00:10:17,260
Imagine that.

133
00:10:17,290 –> 00:10:19,510
So he was born in 1934.

134
00:10:20,290 –> 00:10:26,560
So he was already conscious when all
that shit happened, being born in the west of

135
00:10:26,560 –> 00:10:29,560
Germany and then growing up in what is today

136
00:10:29,710 –> 00:10:29,830
the ...

137
00:10:31,375 –> 00:10:32,215
At least for a few years.

138
00:10:32,665 –> 00:10:37,915
I think that's the first thing which
strongly influenced how I look at

139
00:10:37,945 –> 00:10:39,925
reality, or futures, or history as well.

140
00:10:40,315 –> 00:10:45,505
And I'm really grateful for that experience
and, up until today, for the long, very

141
00:10:45,505 –> 00:10:47,635
deep talks with my dad, which is really cool.

142
00:10:48,175 –> 00:10:53,005
On the other hand, I had a teacher
in high school from the fifth until the

143
00:10:54,205 –> 00:10:54,475
... grade.

144
00:10:54,475 –> 00:10:55,735
I think it was Mr.

145
00:10:55,735 –> 00:10:56,935
Dharma. Rest in peace, Mr.

146
00:10:56,935 –> 00:11:00,385
Dharma, because he suddenly and unexpectedly passed away

147
00:11:00,475 –> 00:11:06,355
when, I think, I was no longer in his class. But he
was really a nice mentor for many of us in my class.

148
00:11:06,655 –> 00:11:13,765
And he used to teach us this saying, this motto
from the Scouts, which is: be prepared.

149
00:11:14,575 –> 00:11:17,855
I never actually encountered the Scouts myself, unfortunately, but

150
00:11:18,655 –> 00:11:21,955
I think this "be prepared" thing, it
started off with: do your homework.

151
00:11:21,985 –> 00:11:23,725
Of course, this was the first thing.

152
00:11:23,965 –> 00:11:29,635
So when you come back from the weekend and it's Monday,
yes, there will be school and you will have had homework.

153
00:11:29,635 –> 00:11:34,015
And if you haven't done it, he's
going to be angry, or stuff like that.

154
00:11:34,345 –> 00:11:36,895
But on the other hand, he was able to teach us

155
00:11:37,915 –> 00:11:45,175
the implicit factors. What I only know today,
and am able to be happy about or thankful for, is

156
00:11:45,175 –> 00:11:52,135
that the aspects of futures generally, of course, imply
that you need to be prepared for certain scenarios.

157
00:11:52,405 –> 00:11:56,935
And going back to the example
from earlier: if you take your umbrella in the

158
00:11:56,935 –> 00:12:00,235
evening and it doesn't rain, you're still
happy about the fact that you carried it.

159
00:12:00,980 –> 00:12:04,790
And even if you didn’t use it, it’s cool because
you were prepared for another scenario and

160
00:12:04,790 –> 00:12:06,980
the same goes for, I don't know, insurance.

161
00:12:08,050 –> 00:12:13,720
Sometimes there's even this scenario of the
self-fulfilling prophecies or the self-defeating prophecies.

162
00:12:14,200 –> 00:12:20,070
Another example is if you prepare for a fight with a
family member, because you think that there’s something

163
00:12:20,070 –> 00:12:25,770
that needs to be spelled out, and you expect the
other person to be angry at you and yell at you

164
00:12:25,770 –> 00:12:30,900
and so on. You anticipate the situation beforehand,
and then you go into the situation and you find

165
00:12:31,530 –> 00:12:33,900
he or she is very grateful that you bring that up.

166
00:12:33,930 –> 00:12:39,180
Finally, that one topic that has been between
you for years. And then you would have

167
00:12:39,180 –> 00:12:42,510
been prepared for the fight or the argument.

168
00:12:43,020 –> 00:12:48,270
And on the other hand, maybe your way
of phrasing things already has changed

169
00:12:48,270 –> 00:12:50,280
before you even started that conversation.

170
00:12:50,520 –> 00:12:51,150
And so you learn

171
00:12:51,895 –> 00:12:57,715
through your own anticipation of the future, which
is great, because I think in so many situations,

172
00:12:57,965 –> 00:13:04,435
we, as a society as well, can learn so many things,
on a low level in general, from those situations.

173
00:13:04,465 –> 00:13:10,375
Like my teacher in the sixth grade told
us: be prepared, and just go through

174
00:13:10,375 –> 00:13:12,835
your mind and play through some scenarios.

175
00:13:14,100 –> 00:13:15,100
It really helped me a lot.

176
00:13:15,100 –> 00:13:21,430
And I think there were some people that
really were influential in the past six years.

177
00:13:21,700 –> 00:13:27,190
And the first thing was when I started my job
here in Leipzig. I moved to Leipzig six years ago,

178
00:13:27,220 –> 00:13:27,340
Yeah.

179
00:13:27,340 –> 00:13:27,730
in 2015.

180
00:13:28,585 –> 00:13:35,395
Great city, by the way. Because my today good
friend Jan, he was back then the chairman of a

181
00:13:35,605 –> 00:13:42,385
think tank, like a trend research think tank, and
they had just started building up a new research or

182
00:13:42,385 –> 00:13:45,895
foresight branch. Back then we were like 11 or 12 people.

183
00:13:45,895 –> 00:13:46,885
I don’t know exactly.

184
00:13:47,215 –> 00:13:48,085
And Jan called me:

185
00:13:49,440 –> 00:13:52,690
I saw your profile on, I think, LinkedIn or Xing or so.

186
00:13:52,990 –> 00:13:55,120
And it says that you studied futures science.

187
00:13:55,190 –> 00:13:55,900
This is so cool.

188
00:13:55,900 –> 00:13:56,380
Let’s talk.

189
00:13:56,770 –> 00:13:58,480
And it was like, okay, well let’s talk.

190
00:13:58,480 –> 00:14:04,090
And so we did. It was 2014, I think, when he
called me, and then we met up in Berlin because of

191
00:14:04,090 –> 00:14:06,070
the fact that I was living in Potsdam and Berlin.

192
00:14:06,940 –> 00:14:11,770
And we had a really nice talk and
had coffee afterwards at the bureau, I think,

193
00:14:12,040 –> 00:14:13,930
and the chemistry was like magic.

194
00:14:14,965 –> 00:14:15,625
That’s really cool.

195
00:14:15,895 –> 00:14:22,435
Then one of my main tasks for four years was
to build up, in that think tank, the scientific

196
00:14:22,465 –> 00:14:25,525
methods of the foresight team, which I did.

197
00:14:25,525 –> 00:14:33,985
And during that time, the company grew from
those 12 peeps to quadruple that, basically,

198
00:14:34,375 –> 00:14:36,595
which was not that healthy for the organization.

199
00:14:36,595 –> 00:14:37,885
And also for the friendships between

200
00:14:38,725 –> 00:14:41,095
some of the people there, not involving me, of course.

201
00:14:41,545 –> 00:14:49,975
But then in 2019 I left the company and it wasn’t all
that easy to leave something brilliant like that behind.

202
00:14:50,275 –> 00:14:56,215
But on the other hand, I was looking ahead into a great
future of being self-employed and freelancing and doing

203
00:14:56,215 –> 00:15:04,905
the stuff I love, and leaving out some of the organizational
stress which always comes with being inside a corporation.

204
00:15:06,355 –> 00:15:14,665
Yeah. But still, I recently met him again and we
had a mixture of vacation and work together in Bulgaria,

205
00:15:14,935 –> 00:15:17,815
which we did for a couple of years, at least once a year.

206
00:15:18,055 –> 00:15:25,975
And it's, how to say it, one of the biggest
mind-changing experiences I ever had in my life

207
00:15:26,005 –> 00:15:30,625
I think, until now. I'm around 33, 34 years old.

208
00:15:30,685 –> 00:15:34,135
And I think that one human being has changed so much:

209
00:15:34,960 –> 00:15:40,270
on the one hand, helping me to
professionalize my futurist, or

210
00:15:40,270 –> 00:15:43,690
futurologist, path into something like a career.

211
00:15:43,690 –> 00:15:44,140
I don’t know.

212
00:15:44,440 –> 00:15:50,500
And on the other hand, to find out more about myself
and to be more confident in public speaking, for example,

213
00:15:50,500 –> 00:15:53,830
or in public guitar playing, because that's our shared

214
00:15:53,830 –> 00:15:57,310
hobby: playing the guitar and singing along at the campfire.

215
00:15:57,765 –> 00:16:01,875
Not more than that; I've done small stages, but
that's definitely enough. But yeah,

216
00:16:01,875 –> 00:16:07,395
this interaction with this one person has changed
so much, and I'm so grateful for this human being.

217
00:16:07,455 –> 00:16:07,695
Yeah.

218
00:16:07,725 –> 00:16:12,945
I think those are like the main
pillars in establishing the route to what

219
00:16:12,945 –> 00:16:14,925
I now am, professionally speaking.

220
00:16:17,855 –> 00:16:24,395
So you talked about building up the futures
part in a scientific way in the company.

221
00:16:25,240 –> 00:16:29,980
And for most people, future and
science don't seem to go together.

222
00:16:30,610 –> 00:16:35,560
Can you give us an idea of what it means
to be scientific when it comes to the future?

223
00:16:36,600 –> 00:16:37,020
Sure.

224
00:16:37,290 –> 00:16:42,840
That's actually still the subject of a discussion
inside the scientific community, because the

225
00:16:42,840 –> 00:16:45,030
future implies that it's not actually there.

226
00:16:45,030 –> 00:16:50,100
And so it’s really tough to study something
which actually doesn’t exist, but that’s also

227
00:16:50,100 –> 00:16:53,490
like the one crucial thing that futures science

228
00:16:54,140 –> 00:16:55,220
always implies.

229
00:16:55,850 –> 00:17:00,800
So we don’t really study the future and we
don’t really predict or forecast the future.

230
00:17:01,050 –> 00:17:02,180
The main task,

231
00:17:03,470 –> 00:17:11,440
in my opinion, is to anticipate the future, or the
images of the futures, which already exist in the

232
00:17:11,440 –> 00:17:14,860
minds of people or of cultures or of organizations.

233
00:17:15,160 –> 00:17:17,950
Let me give you, or our listeners, an example.

234
00:17:18,160 –> 00:17:20,200
One example of how that can look.

235
00:17:20,440 –> 00:17:26,980
So one classic business case is to have a customer,
a client, say one company from the insurance

236
00:17:26,980 –> 00:17:29,530
industry, that needs an outlook into the future.

237
00:17:29,530 –> 00:17:29,740
And.

238
00:17:31,495 –> 00:17:34,525
And they say: dear Kai, we need this report.

239
00:17:34,555 –> 00:17:35,695
Let’s call it a trend study.

240
00:17:36,205 –> 00:17:38,785
What's going to happen in our environment?

241
00:17:38,785 –> 00:17:42,715
So in the economic and the ecological
and the political-regulatory

242
00:17:43,405 –> 00:17:45,445
and societal and so forth environment.

243
00:17:45,865 –> 00:17:50,335
What about those things we don't
really have on our radar yet?

244
00:17:50,605 –> 00:17:54,535
How could they change? And give us a
glimpse into the future, look into your

245
00:17:54,535 –> 00:17:57,055
crystal ball: what is likely to happen?

246
00:17:57,565 –> 00:18:02,755
And then let’s talk about the things we could
already prepare in terms of like strategic

247
00:18:02,785 –> 00:18:06,445
recommendations or business model ideation sessions.

248
00:18:07,525 –> 00:18:12,805
And so I, or we as a think tank back then,
started investigating, for a couple

249
00:18:12,805 –> 00:18:17,065
of months. Actually, the first step
is to understand the business model.

250
00:18:17,215 –> 00:18:23,890
Because even in something like the
insurance industry, every company has its own business

251
00:18:23,890 –> 00:18:30,100
model: how is the revenue being generated, or how
do we talk to our clients or our customers?

252
00:18:30,130 –> 00:18:30,790
Is this B2B,

253
00:18:30,790 –> 00:18:31,810
or is it B2C, and so on.

254
00:18:32,210 –> 00:18:36,790
So at first, we need to understand what really
is the environment of this particular client.

255
00:18:37,210 –> 00:18:38,590
And then we start investigating

256
00:18:39,195 –> 00:18:40,005
with online research.

257
00:18:40,005 –> 00:18:41,715
Of course, we read lots of papers.

258
00:18:41,715 –> 00:18:45,945
We read lots of studies published
in magazines or scientific journals.

259
00:18:46,275 –> 00:18:51,885
And of course we have our tools, like trend
radars, which tell us, like a trend Google if

260
00:18:51,885 –> 00:18:55,035
you will, what the most important trends or

261
00:18:56,140 –> 00:19:01,870
agglomerations are, basically. Is it important for the
insurance industry to deal with quantum security,

262
00:19:01,870 –> 00:19:07,240
I don't know, or with distributed ledger technology?
Or is it more important to look at the European

263
00:19:07,240 –> 00:19:12,790
Union regulatory level and to talk to some experts
from the Parliament or from the EU Commission?

264
00:19:12,790 –> 00:19:13,120
I don’t know.

265
00:19:13,540 –> 00:19:16,600
And then the next step obviously is to talk to those people.

266
00:19:16,990 –> 00:19:20,080
So we do lots of what we call expert interviews.

267
00:19:21,340 –> 00:19:22,900
And what we ask them.

268
00:19:22,900 –> 00:19:28,540
It's not to read out their press papers or
things that they would publish two months later

269
00:19:28,570 –> 00:19:36,010
anyway. The core idea is to identify those people
who have strong influence inside a political system

270
00:19:36,010 –> 00:19:37,840
or inside the economic system, I don't know,

271
00:19:38,020 –> 00:19:38,080
Yeah.

272
00:19:38,160 –> 00:19:48,840
or in the tech area, and ask them what they
think is doable in some years' time, or what is likely.

273
00:19:50,425 –> 00:19:56,155
So there's a very specific interviewing system,
actually, because it's not that easy to really get

274
00:19:56,155 –> 00:20:03,655
those people to talk in futures, because
it's natural to stick to your role inside society.

275
00:20:03,655 –> 00:20:09,505
But if you ask the CEO of an insurance
company how he or she anticipates the year

276
00:20:09,505 –> 00:20:15,955
2040, they will basically replicate the strategy papers
they have worked on for the past five years.

277
00:20:16,255 –> 00:20:19,375
And you need to instantly catch

278
00:20:20,305 –> 00:20:21,115
this moment

279
00:20:22,120 –> 00:20:29,320
when they're actually stuck in history or in
the present, because you want something more from them.

280
00:20:29,320 –> 00:20:35,290
And nobody wants to read a study which basically
only summarizes all those present strategy and PR

281
00:20:35,320 –> 00:20:38,110
papers from companies; that could be done by anyone.

282
00:20:38,560 –> 00:20:38,830
Yeah.

283
00:20:38,830 –> 00:20:38,990
Yeah.

284
00:20:39,040 –> 00:20:39,790
That’s the first thing.

285
00:20:39,790 –> 00:20:44,090
So we talked to experts, and an expert
can also be someone who has not

286
00:20:44,750 –> 00:20:48,170
got the big title of CEO or lead researcher.

287
00:20:48,440 –> 00:20:53,830
But after we did this first round of questions
and interviews, we summarized internally

288
00:20:54,040 –> 00:20:56,350
for interim purposes, those findings.

289
00:20:56,830 –> 00:21:02,830
And then we pick out those things, those aspects
we had as a result, basically from the first wave.

290
00:21:03,010 –> 00:21:06,520
And then we pick those
aspects that are very uncertain

291
00:21:07,110 –> 00:21:08,910
but potentially very influential.

292
00:21:09,510 –> 00:21:15,870
So one could say things like wildcards inside
a branch or an industry. Things like: is it

293
00:21:15,870 –> 00:21:21,960
possible that there’s going to be an insurance
company, which basically only relies on code?

294
00:21:22,710 –> 00:21:26,160
And there are no employees, no employer, no CEO.

295
00:21:26,160 –> 00:21:29,190
It's nothing but code. Is that doable?

296
00:21:29,910 –> 00:21:35,880
And so we have this thesis, and we ask the same
experts, plus an extended panel of more experts:

297
00:21:37,015 –> 00:21:37,585
What do you think?

298
00:21:38,155 –> 00:21:40,105
Is it likely, is it doable?

299
00:21:40,165 –> 00:21:45,985
And the moment we have phrased this thesis,
one of those people will do it anyway.

300
00:21:46,255 –> 00:21:50,275
So we have a very, very high
likelihood that things are going to change.

301
00:21:51,165 –> 00:21:54,085
And at the end, I mean, it differs from project to project.

302
00:21:54,085 –> 00:21:56,745
Some clients prefer the

303
00:21:57,640 –> 00:22:06,040
study, like, say, 50 to 80 pages of text summary, where
you basically summarize the findings of this study process.

304
00:22:06,190 –> 00:22:13,980
Some of the clients like to publish those findings as
a PR marketing thing, in the way of: look, we as an

305
00:22:13,990 –> 00:22:19,480
insurance company are very far ahead and we know the future.
Of course, they won't tell the press that this study actually

306
00:22:19,480 –> 00:22:22,150
is from two years ago, but it’s still 10 years ahead.

307
00:22:23,230 –> 00:22:25,600
The other thing can be, of course, those findings

308
00:22:26,605 –> 00:22:33,625
summarized in an executive summary, and someone makes a film
for internal purposes, for the employers or the employees.

309
00:22:33,985 –> 00:22:40,855
So it's also a little bit familiar, or a little
bit close to science fiction; the science fiction

310
00:22:40,855 –> 00:22:43,855
community uses the same tools from time to time.

311
00:22:44,275 –> 00:22:44,455
Yeah.

312
00:22:44,485 –> 00:22:46,195
But that’s basically like the scientific approach.

313
00:22:46,195 –> 00:22:50,845
So the scientific method basically relies on the fact that

314
00:22:50,845 –> 00:22:55,525
you are able to document the process
of a study, that you are able to replicate

315
00:22:55,525 –> 00:22:57,705
that stuff. That's where it gets a little bit tricky.

316
00:22:57,715 –> 00:22:57,775
Yeah.

317
00:22:58,290 –> 00:23:05,340
Because anything in the social sciences is very hard to
replicate. The mathematicians or biology researchers,

318
00:23:05,370 –> 00:23:12,720
they are already fighting the fight against the
social sciences, because you can't measure social things.

319
00:23:12,780 –> 00:23:18,780
And if I ask you the same questions today and in a
year, you will definitely give different answers.

320
00:23:19,020 –> 00:23:21,150
So it's all about being quick,

321
00:23:22,645 –> 00:23:29,685
taking a snapshot, basically, of the futures images inside
the heads of decision makers, and then making up your own

322
00:23:29,695 –> 00:23:33,025
mind and putting that stuff into the next decade.

323
00:23:34,275 –> 00:23:35,705
And I'm really interested:

324
00:23:35,715 –> 00:23:45,045
what do you personally see as the biggest challenge in the
process of applying futures science in companies and organizations?

325
00:23:46,035 –> 00:23:49,305
I guess one of the biggest challenges is, or might be, that,

326
00:23:50,620 –> 00:23:58,540
still today, in the year 2021, living inside one
of the biggest and most complex transitions or

327
00:23:58,540 –> 00:24:01,150
transformations that humankind has ever experienced,

328
00:24:01,450 –> 00:24:05,200
most of the companies don't really account for change.

329
00:24:05,770 –> 00:24:08,380
They are still made, or being structured,

330
00:24:09,745 –> 00:24:11,305
in terms of stability.

331
00:24:12,205 –> 00:24:18,745
I don't know how you experienced that, but in the past months,
many articles and also events had the headline

332
00:24:18,745 –> 00:24:24,745
of resilience: we need to become more resilient,
which by the meaning of the word actually means

333
00:24:24,865 –> 00:24:28,525
being more stable against outside changes.

334
00:24:28,915 –> 00:24:34,795
And it's being used at the moment as
some kind of PR phrase to actually imply

335
00:24:35,785 –> 00:24:43,225
that you are more agile and more flexible to align
to change, but that's actually not the meaning of the word.

336
00:24:43,375 –> 00:24:51,985
But I think the companies, or most of the organizations,
including public sector institutions, have it

337
00:24:53,275 –> 00:25:00,565
as part of their code to be stable, or to be very
resistant against outside changes, which has one

338
00:25:00,595 –> 00:25:02,815
very good reason, which is survival.

339
00:25:03,055 –> 00:25:08,605
But on the other hand, we've learned over the past,
I think, 20 to 30 years, from very good thinkers,

340
00:25:09,415 –> 00:25:17,275
that being agile in an organizational sense is actually
the number one survival factor in the 21st century.

341
00:25:18,955 –> 00:25:27,715
Also, it leads into an organizational
schizophrenia, in that aligning to

342
00:25:27,715 –> 00:25:33,145
an agile workforce or a labor market is also
one of those changes from the outside which has to

343
00:25:33,145 –> 00:25:40,675
be fought against. And that's where I
came to the conclusion a couple of years ago that these

344
00:25:40,675 –> 00:25:44,455
2020s, this current decade, is going to be the one

345
00:25:45,475 –> 00:25:50,935
which most likely will be remembered as the decade
where most of the companies and organizations

346
00:25:51,730 –> 00:25:55,360
died because they weren't prepared for change.

347
00:25:55,690 –> 00:26:01,630
And that’s actually very tragic because there are so
many people involved in working for those companies.

348
00:26:01,630 –> 00:26:08,830
Especially in Central Europe, we have such a strong
middle class of companies, and there are so many jobs

349
00:26:09,405 –> 00:26:13,875
involved in those companies, and they are
really doing a bad job in coping with change.

350
00:26:14,235 –> 00:26:20,115
And we've known that for a couple of years, and
this pandemic, as a multi-crisis basically,

351
00:26:20,685 –> 00:26:20,895
Yeah.

352
00:26:20,925 –> 00:26:23,595
showed that nothing was prepared.

353
00:26:23,715 –> 00:26:29,685
Although so many people, from the scientific community
to NGOs, I don't know, to society, warned

354
00:26:30,550 –> 00:26:34,270
that something like a pandemic was coming.

355
00:26:34,390 –> 00:26:35,800
And I wasn’t really surprised.

356
00:26:35,800 –> 00:26:36,130
I don’t know.

357
00:26:36,460 –> 00:26:41,500
I think from the futures community, no one really
was surprised that there is now this pandemic.

358
00:26:41,710 –> 00:26:43,150
And we were like, okay, let’s keep going.

359
00:26:43,150 –> 00:26:46,300
And then a month later we were
like, okay, where are the plans?

360
00:26:46,980 –> 00:26:48,660
Oh yeah, you have some plans, right?

361
00:26:48,900 –> 00:26:50,130
Oh, you don't. Fuck.

362
00:26:50,550 –> 00:26:51,840
So I was like, damn it.

363
00:26:52,350 –> 00:26:52,620
Yeah.

364
00:26:52,680 –> 00:26:53,970
So I think that’s my point.

365
00:26:54,880 –> 00:26:59,260
Or my 2 cents on that discussion:
the companies are lost, basically.

366
00:26:59,560 –> 00:27:07,510
And I talk to so many big corporations, and even
the C-level managers, they know. They know

367
00:27:07,510 –> 00:27:11,110
that they are unable to change, to transform

368
00:27:11,935 –> 00:27:20,035
from a partially century-old company
to an agile network of, I don't

369
00:27:20,035 –> 00:27:23,395
know, teams and bubbles and Holacracy stuff.

370
00:27:23,425 –> 00:27:29,935
That's just not doable, especially for highly
regulated things like insurance, like the financial

371
00:27:29,935 –> 00:27:32,605
industry, like mobility and automotive and stuff.

372
00:27:32,635 –> 00:27:34,315
It’s just too much.

373
00:27:34,345 –> 00:27:35,575
And the new players

374
00:27:36,455 –> 00:27:39,575
in the field, let's say Tesla,
or, I don't know, Trade Republic,

375
00:27:39,965 –> 00:27:44,945
they are agile because they started off from
a fresh playing field, and that's totally different.

376
00:27:46,165 –> 00:27:47,155
I fully agree with you.

377
00:27:47,185 –> 00:27:53,515
So what came up just now: one of my friends, who
until recently was in a C-suite position

378
00:27:54,235 –> 00:28:01,855
in a big Austrian company, and very successful, got
multiple awards for his position in his industry.

379
00:28:01,855 –> 00:28:03,835
I don't want to say more because I don't want to uncover who

380
00:28:03,835 –> 00:28:08,505
it is, of course. But he actually made
the company agile, and he was kicked out.

381
00:28:09,640 –> 00:28:16,930
So that’s, it’s very interesting because he was super
successful and it’s a very conservative company.

382
00:28:17,830 –> 00:28:22,900
And actually, during the time where you would need
someone like this person to become the second most

383
00:28:22,900 –> 00:28:25,690
important person next to the CEO, the COO, they kicked him out.

384
00:28:25,720 –> 00:28:30,100
And I see this a lot. I'm not exactly
like this, but I see a lot of what you see:

385
00:28:30,805 –> 00:28:33,805
that a lot of companies will not survive.

386
00:28:34,555 –> 00:28:39,145
And a lot of companies try to be stable
and resilient in the way that you say it.

387
00:28:39,565 –> 00:28:46,105
And what comes up for me when I think of that is
basically, like many people, I always use the metaphor of

388
00:28:46,105 –> 00:28:49,465
the caterpillar becoming a butterfly, the metamorphosis.

389
00:28:50,550 –> 00:28:56,940
And this idea of being stable and
more resilient is as if the metamorphosis

390
00:28:56,940 –> 00:29:05,040
is already going on, and the cells in the cocoon
decide they want to be stable and they want to,

391
00:29:05,460 –> 00:29:09,660
they want to hold on and go back to the old normal.

392
00:29:10,110 –> 00:29:15,720
What happens is exactly that: they die. Because you
need to be agile, let's use this word now also

393
00:29:15,720 –> 00:29:18,930
in this metaphor, and actually become the new thing.

394
00:29:20,435 –> 00:29:23,585
With the essence of the past, but become the new thing.

395
00:29:24,155 –> 00:29:29,765
So, first of all, to unfold
their potential, and also to survive.

396
00:29:30,185 –> 00:29:36,725
And the reality is that out of 400
butterfly eggs, only about eight

397
00:29:37,250 –> 00:29:40,250
survive and become butterflies again.

398
00:29:40,310 –> 00:29:45,440
So it's 2%, and I wouldn't say that
only 2% of the companies will survive now.

399
00:29:45,470 –> 00:29:46,280
Definitely not.

400
00:29:46,310 –> 00:29:50,510
Because when we talk about eggs, not all of these
eggs actually become caterpillars, of course.

401
00:29:51,260 –> 00:29:57,140
But yes, there will be a lot of companies who
really hold on and hold back because of fear

402
00:29:57,170 –> 00:30:05,150
because of the intention of security, the value
of security, and also because of being in denial,

403
00:30:06,610 –> 00:30:08,650
and a lot of them will not survive.

404
00:30:08,950 –> 00:30:11,680
So basically, yeah, that’s my 2 cents at the moment.

405
00:30:11,770 –> 00:30:14,200
And what I always say, and I also want to add that:

406
00:30:14,200 –> 00:30:20,620
I think that we are globally in a kind of a state of
psychosis, and the positive thing about

407
00:30:20,620 –> 00:30:26,710
psychoses that most people don't actually know is
that, if people go through psychoses

408
00:30:26,710 –> 00:30:30,250
in a healthy way, then actually exactly that happens:

409
00:30:30,250 –> 00:30:33,760
the same thing that you have with
butterflies and caterpillars: they actually

410
00:30:34,275 –> 00:30:40,305
come out on the other end in a very different way, not in
a normal-again sort of state, but on a different level.

411
00:30:40,365 –> 00:30:41,745
So most people don’t actually know that.

412
00:30:42,555 –> 00:30:44,685
And that's why I like to compare it.

413
00:30:47,255 –> 00:30:54,705
I'm really interested in what you see that the
pandemic means for us as societies, as humanity.

414
00:30:55,255 –> 00:30:59,875
I hope, I hope actually, and that's not
based on scientific research now,

415
00:31:00,175 –> 00:31:02,125
I hope that society is able to

416
00:31:02,965 –> 00:31:12,715
really step back a little and analyze the situation. Because,
as you just said, psychosis can lead to a better state,

417
00:31:12,715 –> 00:31:20,395
and horrific car accidents or such can too, but they can also
lead to a trauma if you don't really deal with the situation.

418
00:31:20,515 –> 00:31:26,755
And that's the work to be done, which I think
is not the most preferred way most people like to act.

419
00:31:27,025 –> 00:31:30,565
And now we have the same thing on
the table for us as a collective

420
00:31:31,315 –> 00:31:34,915
intelligence: the collective mind now needs to learn.

421
00:31:35,185 –> 00:31:42,865
And on the other hand, it also needs to heal,
because I think of the many micro-accidents or

422
00:31:42,865 –> 00:31:50,125
micro-shocks and traumas which happened during
the pandemic that don't get to be in the headlines

423
00:31:50,175 –> 00:31:52,705
of a big newspaper or on the blogs.

424
00:31:53,455 –> 00:32:03,370
Those are the things that really nourish my sorrows.
Post-COVID scenarios for me often are

425
00:32:03,370 –> 00:32:08,650
based on an insecure, still very uncertain society.

426
00:32:09,040 –> 00:32:13,450
And back in history, it has always been that
societies were very insecure and very unstable.

427
00:32:13,690 –> 00:32:20,050
And even the elites, the established
community of political leaders, were insecure about

428
00:32:20,320 –> 00:32:20,770
what happened.

429
00:32:21,835 –> 00:32:27,175
We all know that a hundred years ago or so we
had the situation just after the First World War.

430
00:32:27,175 –> 00:32:30,955
Then we had the Second World War,
in highly insecure societies.

431
00:32:30,955 –> 00:32:36,715
And with rivalries, like under the
surface, between elites, like conservative

432
00:32:36,715 –> 00:32:40,375
and liberal and progressive, I don't know, the
scientific community, the religious community,

433
00:32:40,395 –> 00:32:40,875
and stuff.

434
00:32:41,685 –> 00:32:44,175
And that led to people being

435
00:32:45,370 –> 00:32:46,270
not really happy.

436
00:32:46,780 –> 00:32:51,640
And then they listened to those demagogues, like
Hitler was one, and, I don't know, they went to war.

437
00:32:51,670 –> 00:32:56,530
They literally sent their children to go to war,
knowing that they were not going to come back.

438
00:32:56,770 –> 00:33:02,650
And I think this is like the ultimate state of
psychosis, like a collective psychosis, maybe.

439
00:33:03,955 –> 00:33:08,125
So desperate. And also, economic factors
aside, you are so desperate that

440
00:33:08,125 –> 00:33:11,365
you're ready to, yeah,

441
00:33:11,395 –> 00:33:17,895
send your kids to war or participate in war against some
enemy, a virtual or a real enemy in most of the cases.

442
00:33:18,105 –> 00:33:22,785
And we've seen that in the past years, this year
as well, also in Israel again, that we had

443
00:33:23,720 –> 00:33:24,260
conflict.

444
00:33:24,260 –> 00:33:31,160
We had so many satellite conflicts, which, for me,
show that even the Cold War never really ended.

445
00:33:31,250 –> 00:33:36,920
It just hasn't been up in the newspapers since
1989, because some actually

446
00:33:38,095 –> 00:33:41,935
smart scientist said that this was
going to be the end of history.

447
00:33:42,595 –> 00:33:45,835
The end of the clash of
two systems, I think that's ridiculous.

448
00:33:45,835 –> 00:33:49,225
I mean, seen from now, just nothing changed.

449
00:33:49,225 –> 00:33:51,295
It was just that the conflict lines changed.

450
00:33:51,565 –> 00:33:53,185
But getting back to the topic, I think

451
00:33:54,625 –> 00:34:01,735
it's a very dark scenario for the future that we won't
be able to learn from this shock, this collective shock.

452
00:34:01,735 –> 00:34:06,865
And I think one of the most important
things, I mean, of course I'm a futures scientist,

453
00:34:06,925 –> 00:34:16,255
but one of the most important things we need to anticipate,
or at least say out loud more often, is this way of

454
00:34:16,525 –> 00:34:26,245
thinking about time. Because I think still, we, at least in the
global north, have made such huge progress in the past

455
00:34:26,245 –> 00:34:30,115
200 or 250 years, but also with very many downsides.

456
00:34:30,115 –> 00:34:36,805
We all know that, in terms of sustainability
and societal justice. But on the other hand,

457
00:34:36,835 –> 00:34:42,520
we've made such huge progress in terms
of, at least, well-functioning laws.

458
00:34:42,550 –> 00:34:47,200
We have the police, we have politicians, we have
corporations who are able to deliver so much

459
00:34:47,200 –> 00:34:51,730
stuff into our supermarkets that we can simply
go there every day between seven in the morning

460
00:34:51,730 –> 00:34:57,640
and, I don't know, 10 in the evening, and just buy stuff
from all over the world, which is actually great.

461
00:34:57,670 –> 00:35:03,370
But so many people don't really realize
that this is something to be grateful for.

462
00:35:04,090 –> 00:35:10,840
And so if something changes in the daily
routines, they are unhappy, and then

463
00:35:10,840 –> 00:35:12,820
they're angry, and they yell at each other.

464
00:35:12,820 –> 00:35:18,490
And they say that those migrants are to blame,
because it's just nuts that they come to

465
00:35:18,490 –> 00:35:21,310
our country and want to steal our jobs and our wives.

466
00:35:21,310 –> 00:35:28,090
And, I don't know, narratives like these nourish on
the ground of unhappy people and ungrateful people.

467
00:35:28,090 –> 00:35:30,820
And I think this is really closely connected

468
00:35:31,960 –> 00:35:40,540
to thinking about time differently. And thereby
I mean: this whole thing we call history

469
00:35:40,870 –> 00:35:43,750
is not a linear story we can tell.

470
00:35:43,930 –> 00:35:46,750
And, greetings to my
history teacher back then,

471
00:35:47,000 –> 00:35:49,750
it's also not something that repeats all over again.

472
00:35:49,960 –> 00:35:52,030
Of course you can always compare things to each other.

473
00:35:52,030 –> 00:35:54,520
You can even compare bananas and
apples, and I think that’s cool.

474
00:35:55,525 –> 00:35:59,875
I think we all need to learn, and we need to
understand better the spiral dynamics and the

475
00:36:00,595 –> 00:36:07,795
levels we have been moving through in the
past decades, because what we are heading towards

476
00:36:08,095 –> 00:36:10,945
possibly can be something very cool for everyone.

477
00:36:11,335 –> 00:36:16,645
Even if we are going to be around 10 billion
people in the world in a few decades, and

478
00:36:16,705 –> 00:36:21,865
some people still believe that we are not able
to feed even the people we have right now.

479
00:36:22,225 –> 00:36:23,235
Now, that is something

480
00:36:23,875 –> 00:36:26,515
we definitely need to discuss publicly.

481
00:36:27,700 –> 00:36:34,930
It's all about justice, and it's all about what we
value as a being, and not just human beings.

482
00:36:34,930 –> 00:36:41,200
It starts with human beings, and we need to try,
in my opinion, to make it affordable for every person

483
00:36:41,200 –> 00:36:48,400
in the world to have access to water, to food, to
schools, to clinics, health systems and stuff like that.

484
00:36:48,670 –> 00:36:50,080
But it’s not that easy, obviously.

485
00:36:50,350 –> 00:36:54,460
And on the other hand, we must not forget

486
00:36:55,300 –> 00:37:02,140
certain planetary beings: the planet itself, Mother
Earth, which nourishes us, which keeps us alive basically,

487
00:37:02,140 –> 00:37:08,170
which is part of the universe. Without the sun, all of a
sudden, we would totally be done; without the moon

488
00:37:08,200 –> 00:37:14,230
we would have some problems; without
forests, clean air, birds, bees,

489
00:37:14,320 –> 00:37:15,130
I don't know, stuff like that.

490
00:37:15,310 –> 00:37:18,160
It’s so big and so complex, but we should grant those.

491
00:37:19,105 –> 00:37:26,755
planetary beings rights, just like it happened a few months
ago in New Zealand, when rivers became legal persons,

492
00:37:27,115 –> 00:37:30,745
which was just a very big step for climate activists.

493
00:37:30,895 –> 00:37:33,685
But New Zealand is not really like
the biggest country in the world.

494
00:37:33,895 –> 00:37:40,695
It's good that things are starting to change,
but as long as it's not forbidden to throw

495
00:37:40,695 –> 00:37:47,055
plastic into the rivers, to dump chemical
waste, like BASF or Bayer or

496
00:37:47,265 –> 00:37:53,835
companies like that do all the time, and
Nestlé is still purchasing water sources in those

497
00:37:53,865 –> 00:38:00,495
regions where most of the people are basically dying
from too little water, and operations like that,

498
00:38:00,555 –> 00:38:03,945
they just don't get the idea of the wholeness

499
00:38:05,060 –> 00:38:06,070
of society, of humankind.

500
00:38:06,550 –> 00:38:12,160
And I think that’s one of the things we need to address
more often and not just in the activist way, because at

501
00:38:12,160 –> 00:38:15,430
least for me, that's not the right way of phrasing it.

502
00:38:15,670 –> 00:38:24,670
But in a way that really can change things, or address
regulatory changes that need to be made so that our

503
00:38:24,730 –> 00:38:28,660
children's or grandchildren's generation will have a planet

504
00:38:29,485 –> 00:38:35,455
which is nice to live on, and not one where
everyone's in the state of a refugee all the

505
00:38:35,455 –> 00:38:43,615
time and the climate catastrophe basically
just destroys the ground for the plants and things we

506
00:38:44,935 –> 00:38:50,745
live from. That's something I call the Leviathan.

507
00:38:51,285 –> 00:38:54,195
It's, like, translated into English,

508
00:38:54,395 –> 00:39:01,005
so, Leviathan, I don't know, this
ancient creature, which was then in the 17th

509
00:39:01,005 –> 00:39:08,555
century used by Thomas Hobbes to describe this
sovereign entity, or the dictator, that

510
00:39:09,250 –> 00:39:10,210
is, or the monarchy.

511
00:39:10,660 –> 00:39:16,570
And he wanted to picture this
destructive force with this ancient figure.

512
00:39:16,840 –> 00:39:23,920
And I think the other term, the scientific term which is
being used for climate change, or this age we are living

513
00:39:23,920 –> 00:39:31,090
in right now, is the Anthropocene, which implies that humans
are responsible for climate change and geological

514
00:39:31,360 –> 00:39:38,470
change, which has already been underway for some centuries,
not even starting with industrialization, but some centuries before.

515
00:39:39,805 –> 00:39:45,985
But I don't agree with that idea, because it implies that
every human being, that human nature itself, means that we

516
00:39:45,985 –> 00:39:51,235
like to destroy nature and that we like to push the
planetary boundaries, which is, I think, not the case.

517
00:39:51,525 –> 00:39:58,545
If you would do a poll, like conduct random interviews
with people on the street, or just type some

518
00:39:58,545 –> 00:40:02,775
numbers into your smartphone and call someone, and
you ask all those people the same question, would

519
00:40:02,775 –> 00:40:04,695
you agree that destroying the planet is cool?

520
00:40:06,565 –> 00:40:08,905
Nearly everyone would agree

521
00:40:09,145 –> 00:40:14,515
that's a very shitty idea. But still,
organizations have those boundaries

522
00:40:14,515 –> 00:40:17,725
and constraints set by regulatory rules.

523
00:40:17,785 –> 00:40:22,585
But as a society, essentially, those organizations
are responsible for those changes, and they can't

524
00:40:22,585 –> 00:40:25,075
just simply stop doing what they're doing.

525
00:40:25,105 –> 00:40:25,945
I understand that.

526
00:40:26,155 –> 00:40:28,225
So we need to anticipate that as well.

527
00:40:28,285 –> 00:40:29,875
when we build our future scenarios.

528
00:40:30,295 –> 00:40:30,475
Yeah.

529
00:40:30,505 –> 00:40:33,595
Long story short, I think rethinking time and

530
00:40:34,660 –> 00:40:41,380
priorities, in terms of what is really, really
essential to surviving, and to living more than

531
00:40:41,380 –> 00:40:44,680
just surviving, is very crucial for our generations.

532
00:40:47,785 –> 00:40:53,365
So we're basically now at the core of what this
podcast is about: which paradigms do you think

533
00:40:53,395 –> 00:40:56,245
need to be challenged for a better future?

534
00:40:56,425 –> 00:41:03,315
First, I think the first principle or
paradigm is that there is not going to be an

535
00:41:03,375 –> 00:41:09,705
artificial intelligence that is able to rescue us
in our situation, or to doom us, at least not yet,

536
00:41:10,420 –> 00:41:15,610
not for a couple of centuries. Because I have a strong
feeling that whenever I go to an event or write

537
00:41:15,640 –> 00:41:20,440
an article or something about or involving
artificial intelligence systems, which means

538
00:41:21,505 –> 00:41:28,525
machine or deep learning systems, people somehow
expect, especially in the non-IT industry.

539
00:41:28,525 –> 00:41:34,975
Of course, they expect me to say, or to tell them,
that someday there's going to be an AI president or a chancellor,

540
00:41:34,975 –> 00:41:41,665
or everything's going to be automated by an AI,
because there's the superintelligence approaching us.

541
00:41:41,875 –> 00:41:46,615
And after spending really many days or weeks
or months, I don’t know, studying this topic

542
00:41:46,795 –> 00:41:49,015
and talking to many experts in the field of AI

543
00:41:49,015 –> 00:41:50,175
and IT, I

544
00:41:50,920 –> 00:42:01,380
lack confidence in those scenarios, because
even if we can somehow simulate human brains with

545
00:42:01,590 –> 00:42:06,270
computers, with the speed of the transistors
and the CPU and stuff like that, even then...

546
00:42:06,270 –> 00:42:06,330
Yeah.

547
00:42:07,165 –> 00:42:12,835
It’s still very unlikely that
those machines are able to be conscious.

548
00:42:13,765 –> 00:42:19,345
I think I won’t go more into detail, but having
said that, this is one of the big paradigms where

549
00:42:19,405 –> 00:42:21,505
I think many industries or decision makers,

550
00:42:22,625 –> 00:42:31,505
and also normal private persons, expect the progress of
technology to somehow land there, as if it’s necessarily

551
00:42:31,505 –> 00:42:37,505
a necessary outcome that we’re going to have a superintelligent
AI agent that rescues us or even dooms us.

552
00:42:37,715 –> 00:42:45,065
And it either leads to a state of, I don’t
know, like waiting for, what’s it called?

553
00:42:45,065 –> 00:42:46,475
Like Jesus, but, like, the big...

554
00:42:47,650 –> 00:42:48,970
Yeah, like the big Messiah, right?

555
00:42:49,290 –> 00:42:56,170
So just waiting, and it’s going to be cool someday, so we can
still pollute the environment or we can still go to our nine

556
00:42:56,170 –> 00:43:01,720
to five jobs and do things just as we did before; as long
as I don’t need to change, everything’s going to be cool.

557
00:43:01,750 –> 00:43:04,300
So this is the one big faction,
and the other big faction is...

558
00:43:06,135 –> 00:43:06,765
For one moment.

559
00:43:07,275 –> 00:43:07,425
Yes.

560
00:43:07,425 –> 00:43:09,105
Because it’s interesting what you’re saying.

561
00:43:09,105 –> 00:43:15,675
And I agree with the second part, definitely. That,
I mean, we have responsibility and we cannot hope

562
00:43:15,675 –> 00:43:18,195
and wait for artificial intelligence, or various

563
00:43:18,195 –> 00:43:22,965
artificial intelligences also, I mean,
there are already multiple artificial intelligences,

564
00:43:23,800 –> 00:43:26,290
and we cannot wait and hope that they will make it better.

565
00:43:26,350 –> 00:43:29,560
So I think that’s completely the wrong approach.

566
00:43:29,560 –> 00:43:36,850
And also from my research, what I realized is
that they’re actually going to be like us.

567
00:43:36,850 –> 00:43:39,310
They’re not going to be better
than us, because we teach them.

568
00:43:39,430 –> 00:43:44,260
So, I mean, that’s the second problem. But when it comes
to the first thing you said about artificial intelligence,

569
00:43:44,590 –> 00:43:47,950
the people I talk to, also on this podcast,

570
00:43:48,500 –> 00:43:51,140
they actually do agree on that.

571
00:43:51,200 –> 00:43:53,150
It is not around the corner.

572
00:43:54,730 –> 00:44:00,880
And also, I had two people on this
podcast, both professors in the US in artificial

573
00:44:00,880 –> 00:44:06,520
intelligence, and both of them actually say artificial
intelligence does already have consciousness.

574
00:44:06,880 –> 00:44:07,060
Okay.

575
00:44:07,080 –> 00:44:14,680
And also, one of them, the guy who is also a musician,
is actually one of the people who was,

576
00:44:15,300 –> 00:44:18,900
like, a very crucial, influential pioneer of machine learning.

577
00:44:19,320 –> 00:44:26,790
He explains that the shift from old-school AI at its
core to what is now machine learning

578
00:44:26,790 –> 00:44:29,190
actually came through the paradigm of creativity.

579
00:44:29,490 –> 00:44:35,760
So there’s actually this idea that AI cannot
be creative, but that seems not to be the fact. But,

580
00:44:35,760 –> 00:44:40,080
as you say, you talk to experts, I talked
to experts, and there are different opinions.

581
00:44:40,080 –> 00:44:43,020
Of course, I’m not an expert when
it comes to artificial intelligence.

582
00:44:43,615 –> 00:44:47,755
I do my research like you, and of
course people say very different things.

583
00:44:47,755 –> 00:44:55,105
So I think, for me, I was very surprised when they said that.
Actually, I wasn’t so much surprised about the creativity part, but

584
00:44:55,105 –> 00:44:59,725
I was really surprised about the consciousness part, that they
say, yeah, artificial intelligence is already conscious,

585
00:44:59,965 –> 00:45:05,695
but it’s a very different conscious, a very different
consciousness, maybe, compared to what we call consciousness now.

586
00:45:05,725 –> 00:45:11,575
But then the question is, once artificial intelligence
becomes a superintelligence, an artificial general intelligence,

587
00:45:13,120 –> 00:45:17,530
what happens then with the consciousness? I
just wanted to add that, in a way, there are

588
00:45:17,530 –> 00:45:19,360
different opinions when it comes to this point.

589
00:45:19,420 –> 00:45:28,180
But definitely, what I agree with you on is that we cannot hope
that we can just continue polluting the environment and

590
00:45:28,180 –> 00:45:30,830
doing all this stuff, and they will fix it for us anyway.

591
00:45:30,830 –> 00:45:32,350
So yeah, definitely.

592
00:45:33,475 –> 00:45:34,555
Yeah, exactly.

593
00:45:34,645 –> 00:45:36,565
Someone needs to get it done.

594
00:45:36,835 –> 00:45:37,025
Yeah.

595
00:45:37,045 –> 00:45:38,245
But I totally agree with you.

596
00:45:38,455 –> 00:45:44,305
And there are, of course, the positions that emphasize
a bit that things we actually can’t really

597
00:45:44,305 –> 00:45:49,615
explain sometimes appear to be like consciousness
or creativity or I don’t know, stuff like that.

598
00:45:49,795 –> 00:45:56,605
I think the point is that what comes into the minds of
people and the society, let it be through science fiction,

599
00:45:57,310 –> 00:45:59,650
through the daily news or YouTube videos.

600
00:45:59,650 –> 00:46:00,040
I don’t know.

601
00:46:00,370 –> 00:46:02,350
Then we have these two positions

602
00:46:02,350 –> 00:46:03,940
that I was just trying to explain.

603
00:46:03,940 –> 00:46:09,370
So the first one is like waiting and wishing, and
the second one... let’s say these are just two of many

604
00:46:09,400 –> 00:46:13,870
positions, but those are, I think, the most contrary
positions, and the other one is like being afraid.

605
00:46:14,410 –> 00:46:17,840
And especially in Germany, we have, I think, more

606
00:46:18,445 –> 00:46:23,935
conservatism compared to some other neighboring
countries in Europe, for a couple of reasons.

607
00:46:24,145 –> 00:46:29,395
But the one thing is that this matches pretty
perfectly with being afraid of AI, because we all

608
00:46:29,395 –> 00:46:34,555
know these science fiction movies, at least
many of them, and in many of these

609
00:46:34,615 –> 00:46:38,575
narratives, the AI somehow, or someday, turns bad.

610
00:46:38,875 –> 00:46:43,475
But to be honest, I’m more afraid of
some North Korean dictator who one day has

611
00:46:44,365 –> 00:46:45,565
AI weapons,

612
00:46:45,745 –> 00:46:50,485
and who’s able to sense, I don’t know,
South Korean or Chinese, or even Austrian

613
00:46:50,485 –> 00:46:52,885
or German people and simply kill them.

614
00:46:52,975 –> 00:46:55,045
That’s doable and that’s not intelligent.

615
00:46:55,045 –> 00:47:00,115
That’s just like a copy of some
totally weird person’s aims and goals.

616
00:47:00,535 –> 00:47:02,575
Maybe I can recommend a cool book.

617
00:47:02,575 –> 00:47:05,605
I would strongly recommend one on this topic, from Max Tegmark.

618
00:47:06,910 –> 00:47:08,740
He wrote the book Life 3.0.

619
00:47:09,370 –> 00:47:14,230
And there he has all these positions, like
in a matrix and talks about these positions.

620
00:47:14,470 –> 00:47:22,210
And I think from what I remember from the book, he
tries not to position himself in this very matrix.

621
00:47:22,660 –> 00:47:23,820
It’s more about getting to know the positions

622
00:47:24,520 –> 00:47:31,360
that are out there. But I think somehow, maybe,
he influenced my expectancy or my view on

623
00:47:31,360 –> 00:47:34,870
how AI might someday develop. And it’s going to happen.

624
00:47:35,020 –> 00:47:40,060
That’s undoubted. But I think
that we have rather centuries than

625
00:47:40,240 –> 00:47:42,430
years or decades until we are at that point.

626
00:47:43,360 –> 00:47:44,860
Of course, it could be like this.

627
00:47:44,860 –> 00:47:52,180
I just... it’s interesting that in the
AI scene, there are so many different opinions, so

628
00:47:52,190 –> 00:47:55,390
there’s really a spectrum and this is the one thing.

629
00:47:55,390 –> 00:47:56,230
And of course, yeah.

630
00:47:57,415 –> 00:48:01,975
When we look at the whole scene now, of course,
there are people also who say things because of

631
00:48:01,975 –> 00:48:07,705
ideology, or some people will say things because
it makes them more interesting, because they

632
00:48:07,705 –> 00:48:09,655
get more funding, you know, and so on and so forth.

633
00:48:09,925 –> 00:48:11,275
It’s really hard to tell also.

634
00:48:11,335 –> 00:48:11,515
Yeah.

635
00:48:12,025 –> 00:48:16,675
So there are a lot of people in Silicon
Valley who say it’s around in 20 years.

636
00:48:16,705 –> 00:48:21,895
So I mean, we most likely will be around to see it,
and it will be exciting to see if this happens.

637
00:48:23,120 –> 00:48:30,260
Yeah, and this is exactly, I think, what you said: we
really cannot hope that the Messiah comes and fixes it.

638
00:48:31,510 –> 00:48:31,960
Totally.

639
00:48:32,410 –> 00:48:38,380
And if I may add just one thing: I think, speaking
for myself, in my own interest, I need to at least

640
00:48:38,470 –> 00:48:45,340
keep this expectancy up, I think, until 2030. Because
one of my current projects, which is going to be published

641
00:48:45,430 –> 00:48:48,130
around the end of the year, is dealing with executives,

642
00:48:48,805 –> 00:48:54,205
the labor market or the workforce, and the
influence of AI in different industries.

643
00:48:54,205 –> 00:48:59,065
So, for example, mobility or other
industries, different industries. And this is going to be

644
00:48:59,095 –> 00:49:00,745
published at the end of the year.

645
00:49:00,745 –> 00:49:06,295
And we work with a pretty narrow definition of AI,
which is also being used by the European Commission.

646
00:49:06,595 –> 00:49:11,095
And we also have the German center for AI research on board,

647
00:49:11,770 –> 00:49:14,530
the DFKI, and they share this definition.

648
00:49:14,530 –> 00:49:20,200
And I think at least until 2030 I’ll be
around for underlining the narrow expectancy.

649
00:49:22,180 –> 00:49:28,870
So, yeah, you were also saying you wanted
to bring up a different topic, paradigm-wise.

650
00:49:29,460 –> 00:49:34,230
I think... still, I mentioned it before, but
it’s so important that we need to go into

651
00:49:34,230 –> 00:49:36,180
detail, or at least underline it again.

652
00:49:36,360 –> 00:49:41,640
Thinking in times differently also
has one dimension of mindfulness,

653
00:49:41,640 –> 00:49:46,020
I think. So we, as a collective, we experience this now

654
00:49:47,560 –> 00:49:54,760
very differently in different cultures or milieus or
continents even, or as someone more digital or less digital.

655
00:49:55,120 –> 00:49:58,930
If you have like a news ticker on your smartphone
and you’re being like bombarded with all these

656
00:49:58,930 –> 00:50:05,020
facts about Corona or COVID-19 all the time, it’s
a very different perception than for someone who reads, like,

657
00:50:06,155 –> 00:50:07,495
the daily newspaper maybe,

658
00:50:07,795 –> 00:50:11,515
and sometimes watches the television news at 8:00 PM.

659
00:50:11,785 –> 00:50:12,955
It’s going to be like very different.

660
00:50:13,225 –> 00:50:19,045
And it made me think of people who are constantly writing
comments or hate speech on Twitter and stuff like that.

661
00:50:19,285 –> 00:50:25,295
So the ranges are wide. But
scientists already suggest that we need to

662
00:50:26,095 –> 00:50:29,965
use, or maybe even emphasize, the loss of pace.

663
00:50:30,205 –> 00:50:37,435
So the deceleration, basically, which was caused by lockdowns
or shutdowns or stuff like that. Because sometimes,

664
00:50:38,185 –> 00:50:45,835
even for someone like me, who is actually, I believe, very
fast-minded... and sometimes I am not able to speak as

665
00:50:45,835 –> 00:50:54,535
fast as I think, and many people have that issue. But, to
put it another way, when I think of the past,

666
00:50:56,470 –> 00:50:58,180
about just how many things happened,

667
00:50:58,180 –> 00:51:03,400
how many paradigms for myself have maybe
changed, or at least nuances have been realigned...

668
00:51:03,500 –> 00:51:08,200
It’s not just this year in particular; like,
for every year, I think, I experienced so many things,

669
00:51:08,200 –> 00:51:11,970
because I learned at a very young age how to be

670
00:51:12,715 –> 00:51:16,405
more self-aware, how to be more mindful in situations.

671
00:51:16,405 –> 00:51:19,615
I embraced practicing at least a little bit of yoga.

672
00:51:19,915 –> 00:51:26,065
Very irregularly, to be honest. When I was around
20 years old, I never got into that thinking

673
00:51:26,095 –> 00:51:33,775
of, like, perceiving yoga or alternative approaches
to workouts or breathing techniques or so as

674
00:51:33,775 –> 00:51:37,495
being something, like, esoteric, spiritual. I just found out,

675
00:51:38,260 –> 00:51:38,950
on my own:

676
00:51:38,980 –> 00:51:39,790
It’s cool.

677
00:51:39,790 –> 00:51:40,390
It works.

678
00:51:40,630 –> 00:51:43,990
I’m around 1.85 meters tall.

679
00:51:44,110 –> 00:51:49,540
So that’s not too tall, but the back pain started
very early for me, and also knee pain, because I

680
00:51:49,630 –> 00:51:51,700
like to do sports, but also very irregularly.

681
00:51:52,750 –> 00:51:56,020
So I had these physical complaints, and
I thought yoga is a nice thing.

682
00:51:56,020 –> 00:52:02,320
So I took it up and practiced it whenever it
works, and it’s cool, and it strengthened my body.

683
00:52:02,530 –> 00:52:05,350
And then I dove deeper into that
breathing techniques stuff.

684
00:52:05,500 –> 00:52:12,520
I don’t know the names of them, but when I’m about to go
on a stage, like in this moment, you need to take a short timeout.

685
00:52:12,550 –> 00:52:16,900
You can sit there in the audience or backstage, and,
like, the moderator or the anchorman is preparing

686
00:52:17,830 –> 00:52:21,400
your speech, or is, like, announcing you.

687
00:52:22,190 –> 00:52:22,650
Exactly.

688
00:52:22,890 –> 00:52:23,610
And you listen to that.

689
00:52:23,610 –> 00:52:28,920
And a lot of people get nervous then, but
it’s all a matter of preparation, of course, on

690
00:52:28,920 –> 00:52:35,820
the one hand; and then these breathing techniques
kick in, and you just do that and you’re cool.

691
00:52:36,000 –> 00:52:37,260
And your brain is totally there.

692
00:52:37,260 –> 00:52:41,280
And for me, everything after
that happens in slow motion, basically.

693
00:52:42,625 –> 00:52:45,865
I observe the audience while I am talking.

694
00:52:45,865 –> 00:52:52,555
And I think to myself how to walk to which position in
the room next, or if it’s now the right time to make a joke,

695
00:52:52,585 –> 00:52:55,825
or I don’t know, because I like to improvise a lot.

696
00:52:55,915 –> 00:52:56,665
You know what I mean?

697
00:52:56,905 –> 00:53:05,355
And those all are aspects that become doable after you
acknowledge yourself as being a very self-aware being.

698
00:53:06,250 –> 00:53:10,180
And looking back in history, it’s not
that old, the idea of the Enlightenment:

699
00:53:10,420 –> 00:53:15,250
everyone can be, like, self-aware and can
change their own biography and learn whatever

700
00:53:15,250 –> 00:53:17,590
they want and do whichever job.

701
00:53:18,805 –> 00:53:24,565
And a lot of people... more and more people are able to
do that. But we still have, like, communities or countries,

702
00:53:24,745 –> 00:53:31,045
like, for instance, Pakistan, where we have this
beautiful story of Malala, very inspiring, a girl

703
00:53:31,285 –> 00:53:33,535
who just wanted to have an education, basic education.

704
00:53:33,805 –> 00:53:40,435
And she wasn’t actually able to do that, because
the Taliban regime was there, and they are still

705
00:53:40,435 –> 00:53:42,725
proclaiming today that education is not made.

706
00:53:43,570 –> 00:53:44,260
females.

707
00:53:44,770 –> 00:53:50,410
But on the other hand, we have this story, and we have
this person who did it and who was able to get out

708
00:53:50,410 –> 00:53:56,650
of that system and become part of the other, I don’t
know, community, the global citizenship, I think.

709
00:53:56,650 –> 00:54:03,160
Because when I learned something about, like, humanism
2.0, we could phrase it like that, that

710
00:54:03,220 –> 00:54:08,560
somehow, like, everyone should have equal rights, equal

711
00:54:09,295 –> 00:54:10,795
opportunities, equal...

712
00:54:10,945 –> 00:54:12,265
I don’t know, not equal payment.

713
00:54:12,385 –> 00:54:13,165
That’s not the case.

714
00:54:13,865 –> 00:54:14,965
That’s not the topic.

715
00:54:15,625 –> 00:54:24,025
I think equality comes from very individual
biographies. But, like, the access to education should be...

716
00:54:25,405 –> 00:54:30,675
it should be there for everybody,
and it starts with basic human rights.

717
00:54:30,675 –> 00:54:37,125
It starts with a basic internet connection, which we still
struggle with here in Germany, because it’s still so slow.

718
00:54:37,125 –> 00:54:43,245
And we have so many rural regions where we don’t have
fast internet. But, I don’t know, you get the idea. And

719
00:54:43,245 –> 00:54:48,465
all that, I think, starts with a very broad

720
00:54:49,645 –> 00:54:56,545
understanding of mindfulness, of what it means to be not
just in the moment, but also, for myself, being in futures

721
00:54:56,545 –> 00:55:01,585
scenarios, being in historical scenarios, being with
friends, being with family, being with people, I don’t

722
00:55:01,585 –> 00:55:04,975
know, people I haven’t seen before, or even people who want to harm me.

723
00:55:05,455 –> 00:55:06,555
I also dealt with

724
00:55:07,420 –> 00:55:09,340
very, very scary situations.

725
00:55:09,770 –> 00:55:16,060
I’ve been traveling around, before the COVID pandemic, to
many, many countries, also involving very dark streets

726
00:55:16,270 –> 00:55:22,180
somewhere in Cape Town or in Medellín, or, I don’t know...
I’ve been to China and experienced for the first time

727
00:55:22,180 –> 00:55:24,550
what it means to be discriminated against.

728
00:55:25,240 –> 00:55:32,410
And it’s, I think, so powerful to do all
these things and to get to know the world,

729
00:55:33,370 –> 00:55:40,180
including issues and challenges as well as
chances and opportunities, and people, primarily

730
00:55:40,180 –> 00:55:47,140
people, to get to know those people better and to
understand their pains. And every perspective changes

731
00:55:48,445 –> 00:55:53,845
drastically whenever you move beyond boundaries,
maybe, like, the boundary of your hometown;

732
00:55:53,845 –> 00:55:55,765
or it starts with your flat or your house.

733
00:55:56,065 –> 00:55:59,035
Maybe you live in a valley or maybe
you should someday leave that valley.

734
00:55:59,275 –> 00:56:02,635
Sometimes you should leave the country or
even the continent, or, I don’t know, get

735
00:56:02,635 –> 00:56:04,735
on a sailing boat and sail to New York.

736
00:56:04,765 –> 00:56:10,315
I heard that this should be a trend right
now, through the Fridays for Future stuff.

737
00:56:10,585 –> 00:56:10,945
I don’t know.

738
00:56:12,130 –> 00:56:17,020
And that’s the last sentence: that it’s important
to embrace different, or also differing or

739
00:56:17,020 –> 00:56:24,760
competing, perspectives, but stay respectful, towards the
environment as well as towards other people.

740
00:56:24,760 –> 00:56:30,130
And as long as that is granted, I think we shouldn’t have
any conflicts anymore, but that’s a long way to go.

741
00:56:33,745 –> 00:56:41,815
My final question is really: when you fast-forward 100
years from now and look back, when people think of

742
00:56:41,815 –> 00:56:44,845
you, what do you want people to be saying about you?

743
00:56:46,365 –> 00:56:46,875
Wow.

744
00:56:46,905 –> 00:56:48,345
That’s a very powerful question.

745
00:56:49,815 –> 00:56:53,245
I hope it would still be said then that
I passed away only some five years ago.

746
00:56:54,700 –> 00:56:59,920
So that implies that I’ve obviously lived
longer than a hundred years, but then I will

747
00:56:59,920 –> 00:57:05,320
have taken the decision on my own to leave this
planet, to make space for other generations.

748
00:57:05,890 –> 00:57:10,450
I think most of the ideas I love to spread,
they change over time, in the first place.

749
00:57:10,630 –> 00:57:12,940
And in the second place, there are so many ideas,

750
00:57:13,795 –> 00:57:21,535
or movements or changes I’d like to... or
processes I’d like to initiate in people’s

751
00:57:21,535 –> 00:57:24,295
minds that are not really linked to my person.

752
00:57:24,595 –> 00:57:29,305
I love that. One entrepreneur,
he’s around 63 years old.

753
00:57:29,305 –> 00:57:37,345
I met him four years ago or so. He’s still so grateful
that we met back then. And two weeks from now,

754
00:57:37,345 –> 00:57:38,695
so at the end of the month,

755
00:57:39,835 –> 00:57:42,145
we will share the same stage for the first time.

756
00:57:42,195 –> 00:57:46,825
Because after that, he started his own
business, coming from being a C-level

757
00:57:46,825 –> 00:57:48,895
manager at a company in the west of Germany.

758
00:57:49,075 –> 00:57:50,935
And now he’s, like, a freelancer on his own.

759
00:57:51,025 –> 00:57:54,625
And he does, like, purposeful things or meaningful
things for him, and he’s still so grateful.

760
00:57:54,625 –> 00:57:58,305
And sometimes he tweets
about that on Twitter and writes

761
00:57:58,365 –> 00:58:04,935
something like: and still, you know, I’m so glad that
you showed me this path, and I’m still so lucky

762
00:58:06,535 –> 00:58:10,975
for having had the chance to embrace
change and futures and stuff like that.

763
00:58:11,305 –> 00:58:15,715
And of course, I love that. It’s better than any, any fee

764
00:58:15,775 –> 00:58:20,395
some client can pay me for a keynote or for, I
don’t know, for a book or so. Actually, I’m so

765
00:58:20,395 –> 00:58:23,035
totally Generation Y, I don’t care about money.

766
00:58:23,035 –> 00:58:28,375
I just care about ideas and like seeing
the spark in the eyes of my audience.

767
00:58:28,705 –> 00:58:29,665
And that’s so cool.

768
00:58:29,785 –> 00:58:33,925
So if I could choose, it would
probably be that, in a hundred years,

769
00:58:35,005 –> 00:58:39,535
people would still remember
some encounter they had with me.

770
00:58:40,075 –> 00:58:46,555
I don’t know where, if it’s been on a stage or in a
lobby or on the train; I ride the train so much,

771
00:58:46,555 –> 00:58:48,265
so the chances to meet me on the train are very high.

772
00:58:49,285 –> 00:58:54,595
So, like, in such a conversation, the people would say:
that’s so cool, because I started rethinking things, and

773
00:58:54,595 –> 00:59:02,635
he didn’t give me the answer to my life question, but
it was like, he showed me at least a couple of doors.

774
00:59:03,325 –> 00:59:06,025
And I found out after that, that this was the right door.

775
00:59:06,025 –> 00:59:11,905
And if it wasn’t, I don’t care anymore, because it was
the right door afterwards, because of what I made of it.

776
00:59:12,625 –> 00:59:15,385
That’s actually the only thing I’d like to be remembered for.

777
00:59:16,395 –> 00:59:16,755
Great.

778
00:59:16,995 –> 00:59:17,985
Thank you very much.

779
00:59:18,465 –> 00:59:19,395
Thank you so much.

780
00:59:20,175 –> 00:59:21,495
It was a great conversation.

781
00:59:21,705 –> 00:59:23,145
I hope to have you back one day, so

782
00:59:23,205 –> 00:59:23,925
it’s continued.

783
00:59:24,045 –> 00:59:24,645
Yeah, same.

784
00:59:24,855 –> 00:59:26,155
Thank you so much for having me Xerxes.

785
00:59:26,175 –> 00:59:26,325
Yeah.

786
00:59:26,415 –> 00:59:26,805
Thanks.

787
00:59:26,985 –> 00:59:27,255
Bye.

788
00:59:29,975 –> 00:59:34,835
Thank you for staying tuned for this
edition of Challenging Paradigm X.

789
00:59:35,255 –> 00:59:39,968
If you liked this episode with Kai Gondlach,
feel free to share it with your community,

790
00:59:40,046 –> 00:59:46,115
so Kai’s message gets spread even further. In the
show notes, you’ll find the links to Kai’s work.

791
00:59:46,550 –> 00:59:49,250
Please hit subscribe and rate my podcast.

792
00:59:49,280 –> 00:59:55,781
If you like it, I’d also be very glad if you write
me a review. If you have any questions or comments,

793
00:59:55,811 –> 01:00:04,091
feel free to contact me. Next week, we are up with
another edition of Challenging Paradigm X. Until then,

794
01:00:04,181 –> 01:00:06,041
I wish you a great week.

 
