[00:10.560 --> 00:13.120] Good morning, everyone. [00:13.340 --> 00:19.540] I have given the name of my talk today, We Are What We Share. [00:19.540 --> 00:29.980] And since you were so shy to ask a question earlier, I'm going to get you all to think about a question right now and to chat with your table. [00:29.980 --> 00:31.200] About this question. [00:31.560 --> 00:33.480] So, we'll just bring up the visuals. [00:33.680 --> 00:34.720] Actually, you know what? [00:34.720 --> 00:36.840] Let's first start with this very short video. [00:42.000 --> 00:46.540] I want to post about how great this coffee is, but I can't think of a funny way to say it. [00:46.540 --> 00:48.340] This post is like a page long. [00:48.340 --> 00:49.680] How do I shorten this? [00:49.680 --> 00:51.220] Just dig out all the vowels. [00:52.660 --> 00:53.420] Seriously. [00:54.560 --> 00:56.680] Do you eat those or just take pictures? [00:56.740 --> 00:58.680] He's the cutest baby on the internet. [00:58.680 --> 01:00.640] Is this guy always on vacation? [01:00.640 --> 01:02.200] Costa Rica vacay. [01:02.200 --> 01:04.840] Did you just like my post about my uncle dying? [01:04.840 --> 01:05.960] Office clerk. [01:06.000 --> 01:06.420] No. [01:06.420 --> 01:07.580] Hold still, hold still. [01:07.580 --> 01:08.660] I'm vining you. [01:08.660 --> 01:10.320] Why are you Instagramming lambs? [01:10.320 --> 01:12.120] How did he get so many followers? [01:12.120 --> 01:13.300] He's a dentist. [01:13.460 --> 01:14.500] Hey, guys. [01:14.500 --> 01:15.600] You on Twitter? [01:15.660 --> 01:16.500] Follow me. [01:16.500 --> 01:18.260] We just got two retweets. [01:18.260 --> 01:21.420] Sometimes I want to move to another country where we don't have to deal with this stuff. [01:21.660 --> 01:21.900] What's up, uncle? [01:21.900 --> 01:24.140] Gigi, Gigi, Gori, I'm tweeting right now. [01:24.140 --> 01:25.900] Does this look like I'm traveling? [01:26.020 --> 01:27.180] Hashtag photo of the day.
[01:27.640 --> 01:28.080] Hashtag... [01:28.080 --> 01:28.880] I quit. [01:28.880 --> 01:29.580] Just kidding. [01:29.580 --> 01:31.620] Hashtag multitasking. [01:31.620 --> 01:32.860] Hashtag squirmworm. [01:32.860 --> 01:34.000] Hashtag road trip dudes. [01:34.000 --> 01:35.160] Not while you're driving, man. [01:35.340 --> 01:36.900] Hashtag yo boy. [01:36.920 --> 01:38.700] Is anybody even going to read this? [01:38.780 --> 01:39.320] Fasty. [01:39.320 --> 01:40.100] Coffee friends. [01:40.100 --> 01:40.720] Mini bagels. [01:40.720 --> 01:41.420] Swish. [01:41.500 --> 01:42.180] Unsubscribe. [01:42.180 --> 01:43.100] What's up, Facebook? [01:43.100 --> 01:44.100] Oh, look at my new shoes. [01:44.240 --> 01:44.840] Unfollow. [01:44.840 --> 01:45.940] I love coffee. [01:45.940 --> 01:47.160] We're going viral. [01:47.160 --> 01:48.180] I don't care about your dog. [01:48.180 --> 01:49.480] I don't care about your lunch. [01:49.480 --> 01:50.220] Mail Archie. [01:50.220 --> 01:51.500] Nobody cares about your tree pics. [01:51.500 --> 01:52.460] Driving selfie. [01:52.460 --> 01:53.140] Glaciers. [01:53.140 --> 01:53.700] Desert. [01:53.700 --> 01:54.460] Third world. [01:54.460 --> 01:55.880] Nobody cares. [02:00.000 --> 02:01.880] I'm mad at all of you. [02:03.260 --> 02:06.020] Dude, I made the popular page. [02:07.980 --> 02:11.560] I thought that was really funny when I saw it. [02:11.560 --> 02:15.900] So that is an interesting snapshot, right, of how much we share. [02:15.900 --> 02:24.380] And the question that I have today for us to reflect on, at least for me to reflect on with you, is how we are in this generation of sharing. [02:24.380 --> 02:34.850] We think about privacy, and I think privacy represents concerns and protections and violations, and yet we're in this era of sharing. 
[02:35.260 --> 02:46.240] How many of you have someone that you know, potentially in your family, that's under the age of 10, that has their own social media account, like Instagram or Facebook, and that shares? [02:46.520 --> 02:48.080] Come on, it's okay. [02:48.100 --> 02:49.050] Raise those hands. [02:49.500 --> 02:51.100] Okay, a few of you. [02:51.100 --> 03:03.280] So my first suggestion for the Privacy Commissioner is to consult 10-year-olds, so that 10-year-olds can think about what they want their digital footprint to look like when they're 40. [03:03.280 --> 03:08.120] Because those who are 40 today, they didn't have a social media account when they were 10. [03:08.560 --> 03:20.540] Or maybe those babies still in the womb, you know, their presence started before they even entered the world with those lovely maternity shots, and I'm guilty of posting some of those myself. [03:20.540 --> 03:31.460] And it's really a big question about how what we share is a reflection of who we are, and how do we think of privacy, not so much from a fear-based perspective, but from an opportunity perspective. [03:31.640 --> 03:40.780] And so what I want to reflect on today are a bit of those opportunities, and also the motivations, and just to understand what are some of those motivations that we have. [03:40.860 --> 03:44.540] But first, there's going to be a question that I have for you. [03:44.540 --> 03:53.380] So with a partner, share an example of how technology has profoundly changed your life in some way, and I'm going to give you 60 seconds just to share with someone at your table. [03:53.400 --> 03:54.360] Please go ahead. [04:30.560 --> 04:32.280] All right, 10 seconds. [04:42.500 --> 04:50.600] Okay, so there is a roaming mic, and I hope you carry these conversations over during the break and over the next few days. [04:54.220 --> 04:55.980] So who would like to share with the group? [04:55.980 --> 04:56.860] Just raise your hand.
[04:56.860 --> 04:58.400] There's a few roaming mics. [04:59.160 --> 05:01.960] I'm looking for at least two people to share. [05:02.780 --> 05:06.560] So if you can raise your hand and someone with a microphone will come to you. [05:07.120 --> 05:09.180] I can't see the hand, but I see the mic. [05:10.280 --> 05:11.720] All right, go ahead. [05:12.680 --> 05:25.360] Hi. My husband's long-lost niece, who'd been estranged from the family for 35 years, found him through Facebook, and he got an email that said, are you from Newfoundland? [05:25.360 --> 05:26.660] I think you're my uncle. [05:26.660 --> 05:34.420] And they were able to reconnect, and she's come back into the family, and so that's a good success story that's come out of Facebook for our family. [05:34.420 --> 05:35.720] Thank you, thank you for sharing. [05:35.860 --> 05:37.060] Round of applause. [05:39.140 --> 05:41.240] And one more, one more example. [05:42.060 --> 05:44.080] Finding a long-lost family member is the first example. [05:44.080 --> 05:48.060] I see a hand up over here, right over here at this table. [05:48.980 --> 05:50.060] Yeah, thanks. [05:52.000 --> 06:00.660] So I'm originally from Zimbabwe, and when I wanted to move, I found my job in Barbados on a professional website. [06:00.660 --> 06:05.760] And then when I moved from Barbados to Canada, I also found my next job through a professional website. [06:05.760 --> 06:08.180] All right, website, thank you. [06:10.080 --> 06:15.720] Yeah, it's great to be able to have your career be enhanced, right, through these opportunities. [06:17.220 --> 06:28.380] This generation of young people has been born into bits, as stated by Don Tapscott, who wrote Growing Up Digital: The Rise of the Net Generation, back in 1997. [06:28.380 --> 06:30.440] I met him at the book launch. [06:30.480 --> 06:35.000] And my co-founder of Taking It Global, we started this organization in 1999. [06:35.020 --> 06:36.680] He was featured in the book.
[06:36.680 --> 06:41.300] He was on Oprah as one of the whiz kids, Growing Up Digital. [06:41.760 --> 06:49.160] And we have been now outpaced by those sort of born, right, fully digital. [06:49.160 --> 06:56.920] There's over 90% of teens who spend a lot of time online in the US and in North America. [06:56.920 --> 07:03.740] And of course, with smartphones, people are constantly socially connected and across multiple platforms. [07:04.040 --> 07:09.640] And some of those platforms are more open, and some of them, maybe people perceive them to be closed. [07:09.640 --> 07:12.400] But it's part of our world today. [07:12.560 --> 07:28.080] And actually, when we think about the role of individuals and in how there's a lot of responsibility for consumers to make these choices, most of the time, especially when we're dealing with teenagers or even those who are under teenage years, like seven-year-olds, [07:28.080 --> 07:32.480] basically you can read and write and then you can create an online presence. [07:32.480 --> 07:34.860] And some parents are aware and some aren't. [07:34.940 --> 07:50.240] And so they're not actually gaining the skills that they need to participate in a really meaningful way because in school, they're not really talking about it in a way that's embracing the technology responsibly. [07:50.460 --> 08:07.500] And so that's where I think that there's a huge opportunity to work with the enthusiasm that young people have for the technology to become more mindful about the benefits that it has to offer and to really be more empowered with that information in a way that's accessible. [08:07.500 --> 08:12.600] Because sometimes it's not just about having the information out there, but being able to understand it. [08:12.600 --> 08:19.560] So we have to think about that seven-year-old who may or may not be getting their parents' permission to create a profile. 
[08:19.660 --> 08:26.300] And how do we present the information, you know, for consumer rights and responsibilities for seven-year-olds? [08:26.340 --> 08:30.700] And I didn't think about the seven-year-old thing until I heard this morning's keynote. [08:30.700 --> 08:34.180] I was thinking more of teenagers and 20-year-olds and 30-year-olds. [08:34.200 --> 08:42.720] And then it hit me that I actually have a lot of young children, like my cousins, in my circle that are posting a lot about their lives online. [08:43.000 --> 08:45.660] And I think it can be a really beautiful thing. [08:45.660 --> 08:47.920] And of course, it can also be a scary thing. [08:47.920 --> 08:55.640] So I was in Winnipeg at the Canadian Museum for Human Rights a few weeks ago for the Canadian Commission for UNESCO. [08:55.640 --> 09:00.740] And I was running a workshop in what is called the Garden of Contemplation, a very beautiful room. [09:00.960 --> 09:06.680] And as I was doing the workshop, when I got started, one of the teens who was in the room, she took out her phone. [09:06.680 --> 09:11.180] And someone came up to me and she said, she's not paying attention, she's using her phone. [09:11.540 --> 09:15.020] And I said, okay, how many of you have a phone in your pocket? [09:15.020 --> 09:16.840] Pretty much 100% of the hands went up. [09:16.840 --> 09:19.440] And how many of you post things with your phones? [09:19.440 --> 09:21.180] How many of your phones have cameras? [09:21.180 --> 09:22.760] Pretty much everyone's hand went up. [09:22.760 --> 09:24.660] All right, let's use your phones. [09:24.720 --> 09:34.080] And I said, how many of you post things online to make a social message, to make a statement about an issue that you care about? [09:34.080 --> 09:36.180] And a very small percentage of hands went up.
[09:37.060 --> 09:48.760] So I felt a bit concerned as someone who was running a workshop, you know, in the Museum of Human Rights, and it was about girls' education and how so many children around the world don't have access to education. [09:48.760 --> 09:57.380] So the workshop was, how can we be agents of change in promoting the right to quality and culturally relevant education around the world? [09:57.380 --> 09:59.320] So then I said, all right, let's use our phones. [09:59.320 --> 10:01.960] And they created little art pieces. [10:01.960 --> 10:04.140] And with their phones, they took photos. [10:04.140 --> 10:08.500] You can see in the image, you can bring the image back up on the screen there. [10:08.500 --> 10:11.160] And so these are the images that the students created. [10:11.160 --> 10:15.920] And then they took photos of those little art pieces, and they posted them online with the hashtag. [10:15.920 --> 10:20.760] But that little exercise was not something that they had ever done before. [10:20.900 --> 10:28.800] And so I think it's important for us to integrate this culture of sharing that we have in a socially responsible way. [10:28.800 --> 10:41.360] And that way people can really feel empowered, both about their rights and responsibilities, and also to contribute to these dialogues about how the industry is changing so quickly. [10:41.360 --> 10:45.980] And it's probably going to have the greatest impact on the youngest people. [10:46.440 --> 10:50.880] So when I was a teenager, I read a book called 1984. [10:51.960 --> 10:53.360] How many of you read that book? [10:53.360 --> 10:54.600] Most of us. [10:54.600 --> 10:56.740] So it's like the eyes of Big Brother. [10:56.740 --> 11:00.280] You know, the worst case scenario, you're being watched everywhere you go. [11:00.280 --> 11:02.840] I had to write an essay, and I was quite distressed. [11:02.840 --> 11:04.560] So I ended up making this painting. 
[11:04.560 --> 11:06.940] This is an actual photo of the painting I created. [11:07.000 --> 11:10.300] And instead of bringing an essay to school, I asked for an extension. [11:10.380 --> 11:12.900] I couldn't write, but all I had was this painting. [11:12.900 --> 11:16.580] And the head of the English department sat with me and deconstructed the painting. [11:16.580 --> 11:17.960] He said, oh, that's Winston. [11:18.040 --> 11:20.120] And he's looking into the future. [11:20.120 --> 11:24.220] The eyes of Big Brother and the flames of the past are burning behind him. [11:24.240 --> 11:31.280] And so this was an early encounter that I had about this whole idea of Big Brother, big data, what does this all mean? [11:31.280 --> 11:39.760] And in the midst of a world where, you know, I'm constantly accepting privacy policies just to be able to access things. [11:39.760 --> 11:47.000] I'm not really thinking about a fear-based mindset, about what is the worst that could happen. [11:47.500 --> 11:50.280] And I guess it's important for us to have that balance. [11:50.280 --> 11:52.900] We're focusing on opportunities and possibilities. [11:53.200 --> 11:57.820] But then when I hear the word privacy, I think about fear and constraint and violation. [11:57.820 --> 12:05.700] And so I wonder how we can have conversations about privacy in a way that actually focuses on possibilities. [12:05.700 --> 12:13.880] And perhaps it's even looking at the way the systems are structured, like being only responsive to complaints as a system. [12:13.880 --> 12:29.020] How do we create more proactive, interactive dialogue as a society, as Canadian citizens across all ages to be able to shape both policy and practice and education in a way that is really meaningful? [12:30.120 --> 12:31.760] So why do we share? [12:32.040 --> 12:37.500] So this is me in Churchill, Manitoba, a few months ago with Polar Bears International. 
[12:38.040 --> 12:41.860] And I think there's many reasons that we can all offer. [12:41.860 --> 12:44.360] I'm going to share a few today that I've come up with. [12:44.360 --> 12:46.120] But this was a moment for me. [12:46.120 --> 12:50.180] So oftentimes it's sharing a moment with others. [12:50.180 --> 12:58.560] So I spent a week on the tundra with scientists and polar bear experts, learning about Polar Bears International. [12:58.560 --> 13:04.920] So part of it was sharing to inform others, but also part of this collective memory. [13:04.980 --> 13:07.760] So you might take a photo, but why would you share a photo with people? [13:07.760 --> 13:09.680] Because we had that shared experience. [13:09.900 --> 13:18.360] So it's a place where a lot of people now are actually storing our collective memory with small groups, and then we're expanding those groups. [13:18.360 --> 13:25.840] And as humans, we think about Maslow's hierarchy of needs, the need to relate to others as a core human need. [13:25.960 --> 13:29.600] And storytelling is very much part of who we are as people. [13:29.600 --> 13:35.400] And it's in some ways what defines us sort of uniquely as a species on the planet. [13:35.480 --> 13:40.520] So that idea of collective storytelling is a huge reason for why we share. [13:40.920 --> 13:43.040] And it also relates to identity. [13:43.040 --> 13:46.000] It helps us to understand who we are as people. [13:46.000 --> 13:53.020] And we think about youth, because I ended up doing my whole master's research around the role of youth in society and how do you define youth. [13:53.020 --> 13:58.600] And a lot of times youth is defined by a stage in life when you're exploring who you are. [13:58.680 --> 14:03.300] So when people are sharing different identities online, it's an exploration. [14:03.300 --> 14:08.960] Whether you're online or offline, you also think about youth with rebellion, people taking risky decisions. 
[14:09.240 --> 14:25.060] And so how could we take that stage in life and ensure that we can really enhance that exploration in a way that doesn't cause harm or damage in later years of their life, or even in the shorter term? [14:25.060 --> 14:28.060] So how do we support the exploration of identity? [14:28.060 --> 14:40.020] One project that we're working on at Taking It Global with support from Canadian Heritage is actually linking places in Canada of cultural, natural, and historic importance to personal identities. [14:40.020 --> 15:00.740] So right there you see two images of youth who have submitted to our platform. Like in Vancouver, the Chinatown entrance is painted on her face to reflect that her identity is informed both by her roots and by an important part of Vancouver. [15:00.740 --> 15:04.980] So how are the places that surround us informing our identity? [15:04.980 --> 15:18.900] So we're trying to create these spaces for reflection and to also get people, in a way, using their phones to get off their phones and go explore places like galleries and museums and enjoy time in the park. [15:18.960 --> 15:24.520] 52% of online teens say that they have had an experience online that made them feel good about themselves. [15:24.520 --> 15:28.360] So really a motivation is to feel good about yourself, self-esteem. [15:29.080 --> 15:32.560] So teens also, you know, the flip side is there is that pressure, right? [15:32.560 --> 15:35.760] Like if all your friends are doing it, there's this whole FOMO, right? [15:35.760 --> 15:36.980] Fear of missing out. [15:36.980 --> 15:40.660] So sometimes people are sharing just because they're afraid of missing out. [15:40.660 --> 15:47.360] So we do want to be careful when we're sharing or you're seeing, you know, people sharing just because they're afraid.
[15:47.360 --> 15:53.180] Because I think fear-based motivations are where we can run into problems. [15:53.280 --> 15:58.800] And so it does, though, allow people to maintain friendships and connections. [15:58.800 --> 16:01.340] We heard a few examples of long-lost friends. [16:01.340 --> 16:06.490] And also it leads to a broader sense of inclusion. [16:07.130 --> 16:13.570] Being socially connected is an important part of what helps us thrive as people and as communities. [16:13.570 --> 16:19.730] And so through these different social connections that we have with people, we can gain access to new world views. [16:19.730 --> 16:23.230] Or if we're running into a problem, people can offer help. [16:23.490 --> 16:27.190] And oh, that was funny. [16:27.690 --> 16:30.070] Sorry, I turned it off. [16:30.810 --> 16:38.830] And so then, though, you have this blurry line between what you share that is private, like for a select group of friends. [16:39.110 --> 16:45.130] Pew Research reports that 60% of teen Facebook users keep their profiles private. [16:45.170 --> 16:49.050] And most report high levels of confidence in their ability to manage their settings. [16:49.050 --> 16:53.170] So you think that what you're sharing is among a controlled group of people. [16:53.170 --> 17:00.890] But so quickly, once what you start sharing gets out there, you start essentially living your life as if you're just living a public life. [17:00.890 --> 17:06.010] And you just start to assume that everything is available for anyone to see. [17:06.010 --> 17:13.710] And once you start straddling that line, you're not really thinking about privacy very much. [17:13.710 --> 17:16.850] You're just assuming that everything is open.
[17:17.070 --> 17:30.350] And so I think it's important to keep having these conversations and to really reflect on what it means for everything to be in the sort of public domain, or for people who think maybe that it's private, like, is it really private? [17:30.990 --> 17:39.790] But once you are in that public domain, you sort of jump from, let's say, being with your friends on Facebook or whatever platform that you think is private. [17:39.790 --> 17:46.050] But then all of a sudden, you jump from being a student to the workforce. [17:46.650 --> 17:50.630] And 92% of recruiters use social media in the recruiting process. [17:51.290 --> 17:56.810] With 76% viewing candidates' sharing of their volunteer work online as important. [17:56.810 --> 18:01.230] And one in three recruiters said that a limited social media presence was a negative thing. [18:01.250 --> 18:08.950] So pretty much you jump from thinking that your online presence is private to realizing it's essential for any form of employment. [18:08.950 --> 18:11.390] And it's like, at what point does it flip? [18:11.390 --> 18:14.570] And when do you start just living everything as public? [18:14.710 --> 18:16.370] So I think this is important. [18:16.370 --> 18:38.650] And we need to work with youth to help navigate these decisions and to be able to talk with one another and with future workplace settings, like with future employers, with organizations, institutions, to really think about what we're sharing and why and how it can serve our best interests and how it can serve the highest good for all. [18:38.690 --> 18:44.850] I know for myself, there's a lot that I share, and it definitely has affected my opportunities. [18:44.850 --> 18:51.570] Like, if I didn't have some sort of presence online, I would have never been nominated by the World Economic Forum as a Young Global Leader.
[18:51.570 --> 18:58.950] And then as I continued to share different ideas within this forum, I would often, you know, tweet my experiences or tweet my ideas. [18:58.950 --> 19:05.230] And suddenly, I was invited to be on a closing panel at the Summer Davos event in China. [19:05.230 --> 19:08.270] This is a photo from that in Dalian. [19:08.370 --> 19:11.170] But just to say that it's becoming essential. [19:11.170 --> 19:18.270] So then at some point, the lines are just so blurry between what is personal, what is professional, what is private, what is public. [19:18.570 --> 19:23.830] And so because those lines are blurry, I don't even know what privacy really means for youth today. [19:23.830 --> 19:28.950] And that's why I think it would be a really important group to consult in a meaningful way. [19:29.270 --> 19:32.770] In my work, I run a charity; Taking It Global is a charitable organization. [19:32.790 --> 19:40.050] I see huge benefit in sharing to support a social cause and also for momentum building. [19:40.170 --> 19:43.170] So this is a snapshot from Global Dignity Day. [19:43.170 --> 19:48.090] And we used video conferencing as a way to connect communities across Canada. [19:48.090 --> 19:52.370] It's actually part of a global movement in over 50 countries around the world. [19:52.370 --> 19:56.230] And it involves people sharing stories about dignity. [19:56.530 --> 20:03.690] So in the top right, there's an image of a survivor of the genocide in Rwanda, and he's sharing his story. [20:04.030 --> 20:08.710] And I was up in Arviat, Nunavut, in the bottom left, and we did an arts-based project. [20:08.710 --> 20:15.850] And the idea of sharing with one another through live video conferencing, we recorded the whole thing, we posted it online. [20:15.850 --> 20:20.770] And it was all about building a movement towards a world filled with dignity.
[20:20.770 --> 20:26.610] And for people to realize that the way we treat others can actually increase or decrease their dignity. [20:26.610 --> 20:28.590] So it's a whole approach to preventing bullying. [20:28.590 --> 20:30.890] And bullying online is a huge issue. [20:30.890 --> 20:35.490] And I think it also leads to a lot of mental health concerns, right? [20:35.490 --> 20:38.570] Anxiety or depression, everything gets amplified online. [20:38.570 --> 20:40.390] So how do we amplify the good stuff? [20:40.390 --> 20:42.190] That's what I'm really interested in. [20:43.850 --> 20:51.390] There is also data that shows that those who support social movements online are more likely to engage in activism in real life. [20:51.390 --> 20:56.210] So I know there's a bit of cynicism around those who just click like to a lot of causes. [20:56.210 --> 20:57.590] It's called clicktivism. [20:57.590 --> 21:02.790] But there is data that shows that it does lead to greater involvement in the real world. [21:02.790 --> 21:06.330] And of course, there's other benefits to sharing online. [21:06.330 --> 21:10.250] For example, with Airbnb, it's new life experiences. [21:10.250 --> 21:13.990] You know, you share your home, you share maybe a cottage. [21:13.990 --> 21:17.170] I'm using someone else's cottage this summer. [21:17.170 --> 21:18.830] And it's different than a hotel. [21:18.830 --> 21:20.870] It's a more authentic experience. [21:20.870 --> 21:26.650] Or sharing interests like with Spotify, you know, with your friends being able to see what you're listening to right here and now. [21:26.650 --> 21:28.990] You know, we're sharing that interest with one another. [21:29.210 --> 21:32.150] Or maybe it's with parking in New Brunswick. [21:32.150 --> 21:35.450] I was just in New Brunswick and they have this hotspot parking meter. [21:35.450 --> 21:37.890] You pay through your phone and then it follows you.
[21:37.890 --> 21:41.310] When you go to a certain restaurant, it knows you're there and deals pop up. [21:41.310 --> 21:43.890] So it's making things convenient for us. [21:43.890 --> 21:45.450] And of course, recommendations. [21:45.490 --> 21:48.230] Amazon is like the king of recommendations. [21:48.430 --> 21:51.130] Following what we do and making suggestions. [21:51.130 --> 21:52.510] And sometimes those are amazing. [21:52.510 --> 21:55.890] Sometimes it can be really annoying that you just want to turn it off. [21:55.890 --> 21:59.150] You search for toilet paper and then all of a sudden it's just non-stop toilet paper. [22:01.810 --> 22:03.390] I do love Amazon. [22:03.860 --> 22:07.430] And then, of course, we share to understand one another. [22:07.430 --> 22:12.170] And my biggest hope is that we can create greater understanding across cultures. [22:12.170 --> 22:17.390] Oftentimes with social media networks, people are staying connected to those that they already know. [22:17.390 --> 22:23.710] And I'm interested in facilitating cross-cultural understanding across languages and regions of the world. [22:23.710 --> 22:29.850] This is a photo with a group of youth from 20 Arab countries hosted at the Library of Alexandria. [22:29.850 --> 22:39.670] And since 2004, Taking It Global has been working to run different regional programs as a way to essentially prevent terrorism by having open dialogue. [22:39.670 --> 22:46.270] And it is important to have a culture of sharing with one another in order to overcome misunderstandings. [22:46.990 --> 22:55.490] So more employers, 79%, say that knowledge and awareness of the wider world is more important than just a degree or grades. [22:55.490 --> 23:05.230] So we even know that an understanding of the world, which we gain by being an active participant in it, which involves sharing, will also help us in life.
[23:05.750 --> 23:19.550] And so my plea is to support a culture of digital citizenship that's really meaningful and not just based on a culture of fear, but also to see the possibilities and the opportunities and to invite critical dialogue. [23:19.990 --> 23:27.450] So if we're old enough to read and write and start creating a presence online, then we're, yes, well under the voting age. [23:27.450 --> 23:30.150] Obviously. And I do have a young child. [23:30.150 --> 23:35.150] And so we need to begin the dialogue as young as possible. [23:35.250 --> 23:39.730] This was a culminating event at Thunderbird House in Winnipeg. [23:39.730 --> 23:49.710] And we were talking about how we can foster digital citizenship among youth. [23:49.710 --> 23:54.050] And they created different sculptures that all represent different themes. [23:54.050 --> 23:56.190] So how can young people be ambassadors? [23:56.190 --> 23:59.070] Um, so yeah, so this is my son. [23:59.070 --> 24:00.310] He's four now. [24:00.430 --> 24:10.490] And for any of you who have children or who have young nieces or nephews, just to reflect and to close on this question, like I wonder, what will he share? [24:10.530 --> 24:17.850] You know, how will I create an enabling environment for him and for his children, maybe my future grandchildren? [24:18.090 --> 24:22.870] Because right now, anything I've ever posted in my life is somewhere online. [24:22.870 --> 24:33.470] Whether I posted it or whether someone else posted it about me, and of course, there's a lot of data that's collected that I didn't actively post, just through searches or through activities. [24:33.730 --> 24:42.330] And so I hope that with all of the decisions and changes that are being made, that we put the next generation of children at the heart of it. [24:42.330 --> 24:45.010] So thank you for inviting me today as your morning keynote.