Sounds good, only 25 minutes, but let's get started. So, as Simon introduced, we're going to be talking a little bit about my favorite topic, which is data fetching unleashed: the story about Next.js (which was touched upon a little bit in the last talk), React Query, and their BFF, and we'll shortly find out that BFF doesn't just stand for best friend forever.
So, a little bit about myself; I'll try not to spend too much time on it. My name is Farris, and as I was already introduced, I'm an engineering manager, a frontend subject matter expert, and also a conference speaker. Some of my previous experience is in connected TV applications, digital asset management, fintech, and also fitness technology, so I've worked with companies such as Fiit, worked a little bit on projects with Discovery+, Eurosport, and GCN, and also have a little bit of experience in the fintech world at a company called Navro.
So what are my interests in the world of technology? I love web performance, which we're going to be talking a little bit about today, along with architecture and engineering leadership. And of course, as software engineers love to do, I also contribute to open source projects, predominantly one called Raycast. I absolutely love it: if you're a Mac user, it's a really great way to enhance your Spotlight search and get a couple of extensions that you can build with React Native. I really recommend checking it out, and it's something I've been contributing to for the past couple of years.
So let's set the stage for what we're going to be talking about today. There's no one-size-fits-all: every project has its own unique requirements, and this talk is not about finding the one true way; we're just going to be talking about a couple of concepts. This is not a tutorial. We're not going to be building a full-fledged application, but rather focusing on patterns, things that we can apply to a multitude of projects. Everyone is going to learn a couple of things today, and everyone has their own unique projects with their own unique requirements, whether at their companies or personally, so we can't just copy and paste solutions; instead we take certain ideas and twist them so that they fit the application we're building. So this is not going to be a one-size-fits-all solution.
We're also going to be looking at a couple of advanced patterns today, and there is such a thing as premature optimization. The libraries we're going to be using are powerful, and it's important to understand whether they're even needed in the first place. If you're building a simple calculator app, I don't think you need global state management, the most incredible data fetching library, and every single library under the sun. But as we move through developing an application, as the requirements develop, as it gets more complex, and as it scales to larger user bases with hundreds, thousands, millions of users, we apply new strategies to solving those problems. And we're going to see today an evolution of how to solve a problem that I looked at a couple of years ago.
So what's the problem we're going to try and solve today? What challenges do developers, in particular frontend developers, face in creating a seamless user experience due to complex data fetching and management?
So why bother looking at performance and how to do this really well? What's the point of user experience? Well, milliseconds are crucial for user satisfaction in the digital space, and quick load times are directly related to user happiness. Page load times in the zero-to-two-second range, which is sort of the golden window, have the highest conversion rates, which is really what you're looking for. Google's research also shows around a 32% increase in the likelihood of a bounce when page load time increases from one to three seconds; the source is attached below. So the faster your website loads, the better the experience users are going to have, and that's not only about the initial load but about every single interaction in your UI. If it reacts appropriately and users get the information they want and see the data they want, they're going to have a better experience, you're going to have better retention rates, and you're going to convert them into paying users, which is ultimately what you want, specifically if you're building a SaaS product, and even more so one that is SEO driven, one that people are just finding on Google. They click through, they don't know you, and your first introduction to a user is how well your experience suits their needs, especially when you're trying to solve a problem that requires a lot of data fetching and a lot of crucial calculations.
So what are the technologies we're going to use today? We touched a little bit on Next.js: if React is the library, then Next.js is the framework. If you haven't looked at it before, Next.js adds some interesting concepts on top of React's capabilities, such as server-side rendering, so you can pre-render pages on the server. You also have the ability to do static site generation, which is really good for content-driven websites or documentation sites. You've got file-based routing, rather than using something like React Router to manage things client side. And then there are also API routes, so it's a full-stack framework: you're able to write Express-style API handlers in it, so it's all under one singular codebase.
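To give a rough idea of that last point, an API route is just a file that exports a handler. This is a minimal sketch; the route name and response shape are made up for illustration.

```ts
// pages/api/hello.ts — a minimal sketch of a Next.js API route (pages router).
// The route name and payload are hypothetical; they just show the handler shape.
import type { NextApiRequest, NextApiResponse } from "next";

export default function handler(
  req: NextApiRequest,
  res: NextApiResponse<{ message: string }>
) {
  // Express-style request/response objects, colocated with the frontend code.
  res.status(200).json({ message: "Hello from the same codebase" });
}
```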
Then we're also going to touch on React Query, which is a really powerful asynchronous state management library, and for those of you who have looked at React Query, note specifically the words "powerful asynchronous state management". A big misconception about React Query is that it's a data fetching library. It's not a data fetching library; it just so happens to be a state management library which is really good at asynchronous tasks such as data fetching. So what does React Query give us? Automatic caching and background updates. Optimistic updates: that's when you send off a request and update your UI before you have a response from the server, and it really helps you do that seamlessly. Pagination and infinite loading: when you're trying to get a lot of data from a server, you sometimes split it up and show it across different pages; that's pagination, and that's something React Query helps us with. Then also prefetching: when we're on the server and we want to prefetch some data before we show it in the UI, before the UI hydrates, we're able to do so. And then query invalidation, which is a very complicated thing to do when you're trying to update your UI: when you perform a task that invalidates data that was previously fetched, React Query allows you to link queries together, so that if I perform action A and data set B becomes stale, I'm able to perform another fetch to get that data again. So it simplifies all those tasks of managing state and asynchronous work.
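For a flavor of how that reads in code, here's a minimal sketch; the query key, endpoint, and mutation shape are invented for illustration, not taken from the talk.

```tsx
// A minimal sketch of React Query usage; endpoint and key names are hypothetical.
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";

function useUsers() {
  // Cached under the "users" key; retries and background refetches come for free.
  return useQuery({
    queryKey: ["users"],
    queryFn: () => fetch("/api/users").then((res) => res.json()),
    staleTime: 60_000, // treat the data as fresh for a minute
  });
}

function useAddUser() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: (user: { name: string }) =>
      fetch("/api/users", { method: "POST", body: JSON.stringify(user) }),
    // Performing action A (adding a user) marks data set B (the users list) as stale.
    onSuccess: () => queryClient.invalidateQueries({ queryKey: ["users"] }),
  });
}
```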
So, bare-bones data fetching. We talked a little about the problem we're working on, so we're going to put ourselves in the box of a scenario. I've worked at a lot of startups before, and at startups there are a lot of problems in the sense that not everything is there and available to you when you need to solve a problem. So what is the problem we're trying to solve today? Here's the scenario: imagine a frontend team that needs to build a dashboard chart displaying the distribution of users' birthdays for the top 5,000 users in the system.

What's our data source? Data is available via an API which returns a list of users with all their data. Very typical scenario: you've got a GET request and it returns a bunch of information related to users.

Now there's a limitation, so solving this is not so simple. The backend team is unable to provide a new endpoint; they're overloaded with work and can't get to a point where they can provide you one, but the frontend team still needs to build this feature and get it shipped. They're also unable to provide a data warehouse solution with the required data in the proper format, so the data is going to come the way it comes, and not necessarily in a way that's suited to the needs of our frontend; we're going to look at a visual example in a bit. And there's a limit of a thousand users returned per request. We saw in the scenario that we need to display information in a graph for 5,000 users, and each API request only returns a thousand at a time, so now we're faced with: how do we build an optimal solution in a suboptimal scenario?

So if we look at the architectural diagram over here, we can start to think about how we might build this application, and we're going to start really simple. I'll move my mouse: we've got a user over here, and we have a user interface, which is our client side, the website the user is viewing. Because our API can only give a thousand data points at a time, we've got to make five parallel requests; we're going to send off five requests at the same time from the client side. When we send requests, what's the typical thing we do? We show a loading state, and while the loading state is happening we can either receive an unhappy response or a happy response of 200. Let's say we receive an unhappy response: a 500, or the server is down, something like that. Then we persist the loading state; we're not going to show anything to the user at this point, because we're going to attempt a retry. That's what traditionally happens: you make a request, something fails, you don't just give up, you perhaps retry once or a second time. If you're not using a library you'll generally have custom logic for this, and we're trying to look at the solution we could build without integrating any libraries at this point, so you'd have custom logic for a retry.
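That hand-rolled retry logic might look something like this sketch; the back-off timing and retry count are assumptions, just to show the kind of code you end up owning without a library.

```ts
// A naive fetch-with-retry helper; not from the talk, just a sketch of the
// custom logic you'd have to write, test, and maintain yourself.
async function fetchWithRetry<T>(url: string, retries = 2): Promise<T> {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      const res = await fetch(url);
      if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
      return (await res.json()) as T;
    } catch (error) {
      // Keep the loading state and retry; only give up after the last attempt.
      if (attempt === retries) throw error;
      await new Promise((resolve) => setTimeout(resolve, 500 * (attempt + 1)));
    }
  }
  throw new Error("unreachable");
}
```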
If you got the happy path, you have the data response back from the promise; you take the data, whatever format it's in, and map it into something suitable for the props you'll send to the chart component, you set it in state, and your UI renders. Then let's say you have another piece of UI, a button somewhere you can perform an action on, and when you click that button, which is linked to state, it sends another state dispatch. If you don't organize your state correctly, a React state change can cause a render for a component, which can cause a refetch of the data, and if you're doing five parallel API requests and repeatedly refetching due to other state changes in the application that have nothing to do with the graph, you risk overwhelming the server with too many requests and receiving a 429. So that's a bit of a problem; we're going to see what it looks like in code and then with a visual UI example.

This is what the code might look like. You might create a hook called useFetchData with a function that gets the data (right now it's getting fake data from a fake server that I built), and once it receives that, it maps the data client side: it groups the birthdays by month, does a little bit of work there, maps that to the chart data format, and then sets the data in state.
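The slide itself isn't reproduced here, but a hook along those lines might look roughly like this; the endpoint, paging parameters, and mapping helper are assumptions for illustration, not the speaker's actual code.

```tsx
// A rough sketch of the client-side useFetchData hook being described;
// endpoint, paging params, and data shapes are hypothetical.
import { useEffect, useState } from "react";

type User = { id: string; birthday: string; [key: string]: unknown };
type ChartPoint = { month: string; count: number };

const MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];

function mapBirthdaysToChartData(users: User[]): ChartPoint[] {
  const counts = new Array(12).fill(0);
  for (const user of users) counts[new Date(user.birthday).getMonth()]++;
  return MONTHS.map((month, i) => ({ month, count: counts[i] }));
}

export function useFetchData() {
  const [data, setData] = useState<ChartPoint[] | null>(null);

  useEffect(() => {
    // Five parallel requests because the API caps each page at 1,000 users.
    const pages = [1, 2, 3, 4, 5].map((page) =>
      fetch(`/api/users?page=${page}&limit=1000`).then((res) => res.json())
    );
    Promise.all(pages)
      .then((results: User[][]) => setData(mapBirthdaysToChartData(results.flat())))
      .catch(() => setData(null)); // no retries, no caching
  }, []); // refires whenever this component remounts

  return data;
}
```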
So let's look at a demo and see what this looks like when we bring it all together. We've got a chart, a really basic one, and it's taking a little bit of time to load because we're making a lot of requests to the server and getting back a lot of information; that's not a fantastic experience. If we look at the network tab and click a hide-and-show button which toggles state, then because the state was not set up correctly, we can actually affect the chart UI that was rendered, and when it appears again and is mounted back onto the DOM, it triggers a side effect which starts fetching again. That's not optimal, because the data didn't need updating; you don't have new users added every single second, so you don't need to refresh your data constantly, you could actually cache it at some point. But that's not something we're doing, because our state is not set up well; it's a little bit problematic.
Now, if you had a look at the network requests, maybe you noticed one thing. The next slide shows that to fetch all that data, five times a thousand, 5,000 data points with a lot of key-value pairs we did not need, a 2.3-megabyte payload is received. That's massive, especially if you're loading it in countries where you've got limited bandwidth, where it's going to take even longer if you don't have a Wi-Fi connection and so on, or if you're on a data roaming package, burning through a lot of that package just to load a simple website. Sometimes these things need to be optimized, and by my calculations around 73% of the data we received is actually unused and unnecessary for showing the chart we want. So we need to rethink this solution, because we're not really showing something optimal.
So, a couple of key factors. Custom retry: if we develop a custom function to manage retries, it needs testing, it needs maintenance, and it's subject to bugs; we don't want to reinvent the wheel, and there are a lot of libraries that already fit that need. Caching: we've got none by default, and there's no reason why we can't cache that data, so we're going to look at that in a little bit. State: it's complex to manage state globally and keep the data in sync with its staleness, and if you don't set up your React component structure and hierarchy correctly, you may have parents that affect child components, causing unnecessary re-renders and side effects that increase the amount of data you fetch when you don't need to. And then data mapping: a lot of effort on the client side, and if this calculation is really complex (which in this case it isn't necessarily), you may need to memoize it, which is additional overhead. And again, we're not optimizing the payload for state and caching.
So, introducing the backend-for-frontend pattern; that's what BFF stands for, not best friend forever. It's otherwise known as the transformation bridge or the aggregation proxy, and there are a couple of key points about it. It adds adaptability to platform requirements: the BFF pattern allows developers to create a backend service proxy that caters to each platform's unique requirements. Let's say you've got a backend, and then you've got a web UI and a mobile UI; you can have a proxy that sits in the middle, whether that's a Node.js service you spin up or a Next.js API endpoint, which takes the data from an endpoint you can't control, manipulates it the way you want, and then delivers it to your client side in whatever data format you need, for whatever your interaction patterns are. You get more efficient resource usage and you decouple the concerns: instead of doing all the logic and all the calculation on the frontend, you delegate the data aggregation, so the calculation of the birthdays, the chart data, and all that transformation work is taken care of on the proxy. If your frontend is already doing a lot of other work, a lot of other calculations, you're reducing the amount of work it has to do; you're able to offload a couple of tasks somewhere else and really keep that separation of concerns. Then you optimize performance: you can dedicate a backend to each frontend, and you can even have a couple of team members work on that particular proxy to do additional caching optimizations, reduce latency, and manage the payload size, and this overall increases performance because you're not doing all that work on the client side. So what could we potentially do? We can take that larger payload we're receiving and condense it down to the minimum that we need, and that's a lot of key-value pairs shrunk right there.
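To make the payload reduction concrete, the idea is to go from raw user objects to just what the chart needs; the field names here are hypothetical, but the before-and-after shape is the point.

```ts
// Hypothetical shapes illustrating the payload reduction, not the actual API contract.
// Before: 5,000 of these cross the wire, with many fields the chart never uses.
type RawUser = {
  id: string;
  name: string;
  email: string;
  address: string;
  birthday: string;
  // ...dozens more key-value pairs
};

// After: the proxy aggregates server-side and returns only twelve small objects.
type BirthdayBucket = { month: string; count: number };
```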
So what's changed in our architectural diagram if we do that, if we bring in this BFF pattern? If we do this, we only make one API request from the client side to a Next.js handler, so instead of five requests we just initiate our loading state, and now our Next.js handler, or whatever microservice sits in the middle, is the one that makes the five parallel API requests. It handles the retry logic if need be, and on the happy path you receive the data, you map it on the server, and the client only receives the minimum amount of payload needed to display the UI you want. You send the data to the client and set it in state, but we're still subject to the same problem where, if any other unrelated state updates happen, you risk re-triggering the entire flow.
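The talk doesn't show the handler itself, but a Next.js API route playing that BFF role could look roughly like this; the upstream URL, page size, and field names are assumptions, and retry logic is omitted for brevity.

```ts
// pages/api/birthdays.ts — a sketch of the aggregation proxy described above.
// Upstream endpoint, paging, and response fields are hypothetical.
import type { NextApiRequest, NextApiResponse } from "next";

type User = { birthday: string };
type BirthdayBucket = { month: number; count: number };

async function fetchPage(page: number): Promise<User[]> {
  const res = await fetch(`https://api.example.com/users?page=${page}&limit=1000`);
  if (!res.ok) throw new Error(`Upstream returned ${res.status}`);
  return res.json();
}

export default async function handler(
  _req: NextApiRequest,
  res: NextApiResponse<BirthdayBucket[] | { error: string }>
) {
  try {
    // The five parallel upstream requests now happen on the server.
    const pages = await Promise.all([1, 2, 3, 4, 5].map(fetchPage));
    const counts = new Array(12).fill(0);
    for (const user of pages.flat()) counts[new Date(user.birthday).getMonth()]++;
    // Only the aggregated, chart-ready payload crosses the wire to the browser.
    res.status(200).json(counts.map((count, month) => ({ month, count })));
  } catch {
    res.status(502).json({ error: "Failed to aggregate upstream data" });
  }
}
```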
And so the code that we wrote before can be trimmed down to just this: we fetch from our proxy, and we don't need to do any of the calculations on our end. That's a little bit cleaner in terms of code; we don't have to do any of that computation on the client side.
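In other words, the hook shrinks to roughly a single fetch of the already-aggregated data; again a sketch, with a hypothetical route name.

```tsx
// The same hook, now just fetching the pre-aggregated payload from the BFF route.
import { useEffect, useState } from "react";

type BirthdayBucket = { month: number; count: number };

export function useFetchData() {
  const [data, setData] = useState<BirthdayBucket[] | null>(null);

  useEffect(() => {
    // One request, one small payload; all mapping happened on the proxy.
    fetch("/api/birthdays")
      .then((res) => res.json())
      .then(setData)
      .catch(() => setData(null));
  }, []);

  return data;
}
```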
key factors there's failure isolation
16:21
key factors there's failure isolation here so you're making just a single CL
16:23
here so you're making just a single CL
16:23
here so you're making just a single CL inside request rather than managing
16:25
inside request rather than managing
16:25
inside request rather than managing multiple let the server do that clear
16:28
multiple let the server do that clear
16:28
multiple let the server do that clear codebase manag M payload optimization
16:31
codebase manag M payload optimization
16:31
codebase manag M payload optimization and then you're able to do the data
16:32
and then you're able to do the data
16:32
and then you're able to do the data mapping on the proxy we're going to go
16:34
mapping on the proxy we're going to go
16:34
mapping on the proxy we're going to go for a second demo and we're going to see
16:36
for a second demo and we're going to see
16:36
for a second demo and we're going to see how this affects your application and it
16:38
how this affects your application and it
16:39
how this affects your application and it already loads a little bit faster and
16:40
already loads a little bit faster and
16:40
already loads a little bit faster and we'll see why we'll dig into the network
16:42
we'll see why we'll dig into the network
16:42
we'll see why we'll dig into the network tab if we
16:44
tab if we refresh you'll see that only one API
16:48
refresh you'll see that only one API
16:48
refresh you'll see that only one API request is made and the data comes back
16:50
request is made and the data comes back
16:50
request is made and the data comes back exactly the way we need it for the chart
16:52
exactly the way we need it for the chart
16:52
exactly the way we need it for the chart and we're only loading 666 bytes versus
16:57
and we're only loading 666 bytes versus
16:57
and we're only loading 666 bytes versus what we were before which is a 99 .9%
16:59
what we were before which is a 99 .9%
16:59
what we were before which is a 99 .9% payload reduction now this is uh a
17:03
payload reduction now this is uh a
17:03
payload reduction now this is uh a scenario that was designed to prove a
17:05
scenario that was designed to prove a
17:05
scenario that was designed to prove a point in this talk you're not going to
17:07
point in this talk you're not going to
17:07
point in this talk you're not going to see a 99.9% payload reduction in your
17:10
see a 99.9% payload reduction in your
17:10
see a 99.9% payload reduction in your production applications or of the other
17:12
production applications or of the other
17:12
production applications or of the other uh all the other projects that you work
17:13
uh all the other projects that you work
17:13
uh all the other projects that you work on so you will see some optimization but
17:16
on so you will see some optimization but
17:16
on so you will see some optimization but this is a little bit of an exaggerated
17:18
this is a little bit of an exaggerated
17:18
Bring in React Query, otherwise now known as TanStack Query, which is a powerful asynchronous state management library. I'm not going to go through all the features; that's something you can look up on the website yourself. But we're going to dig in a little bit into how this changes our architecture if we bring it in. We know that we have a user interface, we're making one API request from the client to the Next.js handler, and now we're using React Query to manage that. So instead of us using a useState or any other state management library to handle the loading state, React Query takes care of that for us, so we don't need to care about loading. Then our proxy still makes the five API requests; nothing changes there. If something fails and comes back as a 500, we return an error code to the client and need to attempt a retry, and React Query handles that retry on our behalf. We don't have to write any retry logic, we don't have to have anything that's tested on our side, we don't have anything that has to be maintained or that's subject to bugs; we're offloading the whole retry mechanism to the React Query library. It can do, you know, 10 retries, or three if you want; you can configure that. You can even configure exponential backoff, which is a strategy where, instead of doing three retries one right after the other, you stagger them: the first retry after one second, the second after two seconds, then three seconds, and so on. That gives the server more grace to recover before it returns anything, and it adds a bit more resilience to the application.
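As a rough sketch (the numbers are illustrative, not from the talk), configuring retries with exponential backoff in React Query looks something like this:

```ts
import { useQuery } from "@tanstack/react-query";

// Hypothetical fetcher hitting the BFF endpoint from earlier.
async function fetchChartData() {
  const res = await fetch("/api/chart-data");
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

export function useChartData() {
  return useQuery({
    queryKey: ["chart-data"],
    queryFn: fetchChartData,
    retry: 3, // retry failed requests up to three times
    // Exponential backoff: wait 1s, 2s, 4s, ... capped at 30s between attempts.
    retryDelay: (attempt) => Math.min(1000 * 2 ** attempt, 30_000),
  });
}
```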
Now, moving back to the happy path: we do the same thing. We map the data on the server, we send it to the client, React Query derives the state from the response, and we render the UI. And now, if a React state change happens, React Query actually checks its in-memory cache (or we can configure an offline cache). If there's a cache hit, we go straight to rendering the UI and we have no side effects; we're not redoing all the requests, because it's a cache hit. If it's stale, say it's been sitting there for five minutes and we've declared that after five minutes the data is no longer useful and we want a fresh set of data, then we restart the journey. So we're reducing the load and impact on that server.
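A minimal sketch of that freshness window, assuming we treat the chart data as fresh for five minutes (the value is just an example):

```ts
import { useQuery } from "@tanstack/react-query";

export function useChartData() {
  return useQuery({
    queryKey: ["chart-data"],
    queryFn: () => fetch("/api/chart-data").then((res) => res.json()),
    // Cached data is considered fresh for 5 minutes; within that window,
    // re-renders and remounts read from the in-memory cache with no refetch.
    staleTime: 5 * 60 * 1000,
  });
}
```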
we're reducing the load and impact on that server so the code moved from our
19:34
that server so the code moved from our
19:34
that server so the code moved from our custom uh hook that we created and now
19:37
custom uh hook that we created and now
19:37
custom uh hook that we created and now we're using use Query which is a hook
19:39
we're using use Query which is a hook
19:39
we're using use Query which is a hook that comes directly from the gra query
19:41
that comes directly from the gra query
19:41
that comes directly from the gra query library and we simplify that code quite
19:43
library and we simplify that code quite
19:43
library and we simplify that code quite a bit what's interesting also even if we
19:45
a bit what's interesting also even if we
19:45
a bit what's interesting also even if we did custom uh cach management with local
19:47
did custom uh cach management with local
19:47
did custom uh cach management with local storage in our in our custom solution
19:50
storage in our in our custom solution
19:50
storage in our in our custom solution one of the really complicated things
19:51
one of the really complicated things
19:51
one of the really complicated things with that is um managing query
19:54
with that is um managing query
19:54
with that is um managing query parameters with caching so you can see
19:57
parameters with caching so you can see
19:57
parameters with caching so you can see that there query Keys over here that we
19:58
that there query Keys over here that we
19:58
that there query Keys over here that we put to react query for the same endpoint
20:01
put to react query for the same endpoint
20:01
put to react query for the same endpoint I may have different query keys that I
20:03
I may have different query keys that I
20:03
I may have different query keys that I provide so I may Fetch with a Quant
20:05
provide so I may Fetch with a Quant
20:05
provide so I may Fetch with a Quant quantity of five this time another time
20:07
quantity of five this time another time
20:07
quantity of five this time another time a quantity of 10 a quantity of one and
20:10
a quantity of 10 a quantity of one and
20:10
a quantity of 10 a quantity of one and this allows you to store Cash for every
20:13
this allows you to store Cash for every
20:13
this allows you to store Cash for every single query paramet com uh combination
20:16
single query paramet com uh combination
20:16
single query paramet com uh combination for a particular endpoint that's also a
20:18
for a particular endpoint that's also a
20:18
for a particular endpoint that's also a lot easier to
20:19
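A small sketch of that per-parameter caching, assuming a hypothetical quantity parameter on the same endpoint; each distinct key gets its own cache entry:

```ts
import { useQuery } from "@tanstack/react-query";

export function useChartData(quantity: number) {
  return useQuery({
    // ["chart-data", 5], ["chart-data", 10], ["chart-data", 1] are all
    // cached independently — one entry per query parameter combination.
    queryKey: ["chart-data", quantity],
    queryFn: () =>
      fetch(`/api/chart-data?quantity=${quantity}`).then((res) => res.json()),
  });
}
```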
So, to quickly wrap this up, the key factors with the library are that we have all of these things handled for us, which makes life a lot easier: we've got caching, we limit re-renders, and we add a bit of out-of-the-box, configurable resilience. But there's one more thing we can do, and that's bring this over to the server. Right now we're still doing all this work on the client and hitting that proxy server, but in Next.js you're actually able to pre-render or prefetch a couple of things before it sends the skeleton HTML over to your client and starts to hydrate it with all the JavaScript. So this time, instead of starting at the UI, we actually start at the server. The server will attempt to prefetch the data from the proxy. When the data comes back, let's say it's the happy path, it will send that frozen, dehydrated state over to the client side, which immediately hydrates it with React Query, and then you won't even have an initial loading state; you'll directly render the UI. So this way you can actually get an even faster response.
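A minimal sketch of that server-side prefetch using React Query's hydration helpers, assuming the App Router and React Query v5; the Chart component, fetcher, and URL are placeholders:

```tsx
// app/dashboard/page.tsx — hypothetical server component
import {
  dehydrate,
  HydrationBoundary,
  QueryClient,
} from "@tanstack/react-query";
import { Chart } from "./Chart"; // hypothetical client component that calls useQuery

// Hypothetical fetcher; on the server an absolute URL is needed.
async function fetchChartData() {
  const res = await fetch("https://example.com/api/chart-data");
  if (!res.ok) throw new Error("Prefetch failed");
  return res.json();
}

export default async function DashboardPage() {
  const queryClient = new QueryClient();

  // Prefetch on the server; the result is serialized ("dehydrated") below.
  await queryClient.prefetchQuery({
    queryKey: ["chart-data"],
    queryFn: fetchChartData,
  });

  // The client hydrates this state, so useQuery finds the data immediately
  // and renders without an initial loading state.
  return (
    <HydrationBoundary state={dehydrate(queryClient)}>
      <Chart />
    </HydrationBoundary>
  );
}
```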
get an even faster response however in scenarios like this some more problems
21:28
scenarios like this some more problems
21:28
scenarios like this some more problems come into play you actually have uh an
21:31
come into play you actually have uh an
21:31
come into play you actually have uh an issue where if your request takes too
21:34
issue where if your request takes too
21:34
issue where if your request takes too long to come back your uh your UI will
21:38
long to come back your uh your UI will
21:38
long to come back your uh your UI will actually never receive its initial HTML
21:40
actually never receive its initial HTML
21:40
actually never receive its initial HTML your browser will never receive it so if
21:42
your browser will never receive it so if
21:42
your browser will never receive it so if your um if your request is taking two
21:45
your um if your request is taking two
21:45
your um if your request is taking two seconds to arrive your user will wait
21:46
seconds to arrive your user will wait
21:46
seconds to arrive your user will wait two seconds on a blank page before
21:48
two seconds on a blank page before
21:48
two seconds on a blank page before actually seeing anything this is where
21:50
actually seeing anything this is where
21:50
actually seeing anything this is where we can Implement a timeout control where
21:52
we can Implement a timeout control where
21:52
we can Implement a timeout control where okay I'm trying in an optimal scenario
21:54
okay I'm trying in an optimal scenario
21:54
okay I'm trying in an optimal scenario to get the information and prefetch it
21:56
to get the information and prefetch it
21:56
to get the information and prefetch it as soon as possible let's it comes back
21:59
as soon as possible let's it comes back
21:59
as soon as possible let's it comes back in 10 milliseconds fantastic but if it
22:00
in 10 milliseconds fantastic but if it
22:00
in 10 milliseconds fantastic but if it takes over a th000 milliseconds bail
22:03
takes over a th000 milliseconds bail
22:03
takes over a th000 milliseconds bail don't bother making that optimization
22:06
don't bother making that optimization
22:06
don't bother making that optimization and try to show the UI hydrated and
22:08
and try to show the UI hydrated and
22:08
and try to show the UI hydrated and continue fetching on the client side so
22:11
continue fetching on the client side so
22:11
continue fetching on the client side so over here you can build your system to
22:13
over here you can build your system to
22:13
over here you can build your system to be able to in its most ideal scenario
22:16
be able to in its most ideal scenario
22:16
be able to in its most ideal scenario fetch the data as fast as possible but
22:18
fetch the data as fast as possible but
22:19
fetch the data as fast as possible but then if something goes wrong or it takes
22:20
then if something goes wrong or it takes
22:20
then if something goes wrong or it takes too long you have back you have backups
22:23
too long you have back you have backups
22:23
too long you have back you have backups and you have fall backs that happen and
22:24
and you have fall backs that happen and
22:24
and you have fall backs that happen and then everything tries to recover on the
22:26
then everything tries to recover on the
22:26
then everything tries to recover on the client side so it adds a lot of
22:28
client side so it adds a lot of
22:28
client side so it adds a lot of resilience
22:29
And this is what the code looks like to be able to do that. I'm not going to dig too much into it; I'm happy to share the slides later on, but I also appreciate that I'm slowly running out of time and Simon might stop me.
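A minimal sketch of that timeout-and-bail idea, assuming the 1,000 ms cutoff mentioned above; the helper names and fetcher are my own illustration, not the code from the slide:

```ts
import { QueryClient } from "@tanstack/react-query";

// Hypothetical fetcher for the BFF endpoint.
async function fetchChartData() {
  const res = await fetch("https://example.com/api/chart-data");
  if (!res.ok) throw new Error("Upstream error");
  return res.json();
}

// Race the server-side prefetch against a timeout; if the timeout wins, we
// simply move on and let the client fetch the data itself after hydration.
export async function prefetchWithTimeout(
  queryClient: QueryClient,
  timeoutMs = 1000 // illustrative cutoff
) {
  const prefetch = queryClient.prefetchQuery({
    queryKey: ["chart-data"],
    queryFn: fetchChartData,
  });
  const timeout = new Promise<void>((resolve) =>
    setTimeout(resolve, timeoutMs)
  );

  // prefetchQuery never rejects; whichever promise settles first wins. On a
  // timeout we bail: the dehydrated state just won't contain this query, and
  // React Query recovers by fetching on the client with its normal loading state.
  await Promise.race([prefetch, timeout]);
}
```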
slowly and Simon might stop me so what happens with the server side uh now that
22:42
happens with the server side uh now that
22:43
happens with the server side uh now that we're moving things to the server side
22:44
we're moving things to the server side
22:44
we're moving things to the server side we're able to prefetch so we're reducing
22:45
we're able to prefetch so we're reducing
22:45
we're able to prefetch so we're reducing the UI delay so we're showing things a
22:48
the UI delay so we're showing things a
22:48
the UI delay so we're showing things a little bit sooner we're also load
22:49
little bit sooner we're also load
22:49
little bit sooner we're also load balancing a little bit so if your
22:51
balancing a little bit so if your
22:51
balancing a little bit so if your client's going to be making a lot of
22:52
client's going to be making a lot of
22:52
client's going to be making a lot of requests you can make a couple on the
22:54
requests you can make a couple on the
22:54
requests you can make a couple on the server beforehand so that you're
22:55
server beforehand so that you're
22:55
server beforehand so that you're reducing the amount of work that the
22:57
reducing the amount of work that the
22:57
reducing the amount of work that the client's doing at a network level too
22:59
client's doing at a network level too
22:59
client's doing at a network level too you also have data recovery strategies
23:01
you also have data recovery strategies
23:01
you also have data recovery strategies which means that if your server fails to
23:04
which means that if your server fails to
23:04
which means that if your server fails to prefetch and times up your client side's
23:07
prefetch and times up your client side's
23:07
prefetch and times up your client side's going to start recovering for you and
23:09
going to start recovering for you and
23:09
going to start recovering for you and then also another concept that you can
23:11
then also another concept that you can
23:11
then also another concept that you can introduce is query criticality which is
23:14
introduce is query criticality which is
23:14
introduce is query criticality which is if on the server I'm trying to fetch a
23:16
if on the server I'm trying to fetch a
23:16
if on the server I'm trying to fetch a set of data and it doesn't arrive and I
23:20
set of data and it doesn't arrive and I
23:20
set of data and it doesn't arrive and I deem that it's not critical to show to
23:22
deem that it's not critical to show to
23:22
deem that it's not critical to show to my UI you can just bail and fetch that
23:26
my UI you can just bail and fetch that
23:26
my UI you can just bail and fetch that data in a uh in a deer prioritize manner
23:29
data in a uh in a deer prioritize manner
23:29
data in a uh in a deer prioritize manner on the client side so an example of this
23:31
on the client side so an example of this
23:31
on the client side so an example of this may be that we have the um chart data
23:35
may be that we have the um chart data
23:35
may be that we have the um chart data that we're showing what's crucial for
23:37
that we're showing what's crucial for
23:37
that we're showing what's crucial for the chart data is to receive back all
23:39
the chart data is to receive back all
23:39
the chart data is to receive back all that data from the 5,000 users and
23:40
that data from the 5,000 users and
23:40
that data from the 5,000 users and that's a critical query if it's critical
23:43
that's a critical query if it's critical
23:43
that's a critical query if it's critical and I'm trying to prefetch it and the
23:45
and I'm trying to prefetch it and the
23:45
and I'm trying to prefetch it and the data doesn't come back I can show some
23:46
data doesn't come back I can show some
23:46
data doesn't come back I can show some error States on my UI if I mark it as an
23:49
error States on my UI if I mark it as an
23:49
error States on my UI if I mark it as an optional query and the query is not
23:52
optional query and the query is not
23:52
optional query and the query is not crucial to show to my UI I can say that
23:56
crucial to show to my UI I can say that
23:56
crucial to show to my UI I can say that bail request show the UI and you can
23:59
bail request show the UI and you can
23:59
bail request show the UI and you can attempt to later on try to refetch the
24:02
attempt to later on try to refetch the
24:02
attempt to later on try to refetch the data because it's not something I need
24:03
data because it's not something I need
24:03
data because it's not something I need to immediately show to the users so you
24:05
to immediately show to the users so you
24:05
to immediately show to the users so you can decide what's more important to
24:07
can decide what's more important to
24:07
can decide what's more important to fetch and what's less important to fetch
24:09
fetch and what's less important to fetch
24:09
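A sketch of how such a criticality flag might be wired up; the flag and its handling are my own illustration of the idea, not an established React Query feature:

```ts
import { QueryClient } from "@tanstack/react-query";

type PrefetchSpec = {
  queryKey: unknown[];
  queryFn: () => Promise<unknown>;
  critical: boolean; // hypothetical flag: is this data required for first paint?
};

export async function prefetchAll(
  queryClient: QueryClient,
  specs: PrefetchSpec[]
) {
  // fetchQuery rejects on failure, so allSettled tells us which queries failed.
  const results = await Promise.allSettled(
    specs.map((spec) =>
      queryClient.fetchQuery({ queryKey: spec.queryKey, queryFn: spec.queryFn })
    )
  );

  // Only a failed *critical* query should surface an error state; optional
  // queries simply fall back to being fetched lazily on the client.
  const criticalFailed = results.some(
    (result, i) => specs[i].critical && result.status === "rejected"
  );
  return { criticalFailed };
}
```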
fetch and what's less important to fetch and we're going to go for a final demo
24:10
and we're going to go for a final demo
24:10
and we're going to go for a final demo to see how this all ties together and
24:12
to see how this all ties together and
24:12
to see how this all ties together and hopefully makes sense so this time we're
24:14
hopefully makes sense so this time we're
24:14
hopefully makes sense so this time we're reloading and it comes really really
24:17
reloading and it comes really really
24:17
reloading and it comes really really fast we click the hide and show button
24:19
fast we click the hide and show button
24:19
fast we click the hide and show button and because everything's being cached we
24:22
and because everything's being cached we
24:22
and because everything's being cached we show things nearly immediately because
24:24
show things nearly immediately because
24:24
show things nearly immediately because deriving from the cash and not doing
24:26
deriving from the cash and not doing
24:26
deriving from the cash and not doing another fetch we're showing also example
24:28
another fetch we're showing also example
24:28
another fetch we're showing also example of how the timeouts work so over here
24:31
of how the timeouts work so over here
24:31
of how the timeouts work so over here there was no timeout that initiate uh
24:33
there was no timeout that initiate uh
24:33
there was no timeout that initiate uh there was a timeout that initiated the
24:35
there was a timeout that initiated the
24:35
there was a timeout that initiated the request took too long to come back I'll
24:38
request took too long to come back I'll
24:38
request took too long to come back I'll quickly Replay that actually uh the
24:40
quickly Replay that actually uh the
24:41
quickly Replay that actually uh the request took too long to come back so it
24:42
request took too long to come back so it
24:42
request took too long to come back so it actually tried to recover on the client
24:44
actually tried to recover on the client
24:44
actually tried to recover on the client side but when it happens in an ideal
24:46
side but when it happens in an ideal
24:46
side but when it happens in an ideal scenario like the first time you can see
24:50
scenario like the first time you can see
24:50
scenario like the first time you can see that there was no client side request
24:52
that there was no client side request
24:52
that there was no client side request initially so over here everything was
24:54
initially so over here everything was
24:54
initially so over here everything was prefresh on the server if we look later
24:56
prefresh on the server if we look later
24:56
prefresh on the server if we look later on and we where we adjusted the timeout
24:59
on and we where we adjusted the timeout
24:59
on and we where we adjusted the timeout um I'll look a little bit in the code we
25:00
um I'll look a little bit in the code we
25:00
um I'll look a little bit in the code we just the timeout it has to come back
25:02
just the timeout it has to come back
25:02
just the timeout it has to come back within uh 500 milliseconds I think we
25:06
within uh 500 milliseconds I think we
25:06
within uh 500 milliseconds I think we said um let me play that from here 500
25:10
said um let me play that from here 500
25:10
said um let me play that from here 500 milliseconds the request isn't isn't
25:12
milliseconds the request isn't isn't
25:12
milliseconds the request isn't isn't able to do that so now the application
25:15
able to do that so now the application
25:15
able to do that so now the application bails and tries to recover with the
25:16
bails and tries to recover with the
25:16
bails and tries to recover with the loading State on the client side so this
25:18
loading State on the client side so this
25:18
loading State on the client side so this is making our application a lot faster a
25:21
is making our application a lot faster a
25:21
is making our application a lot faster a lot more responsive a lot more resilient
25:22
lot more responsive a lot more resilient
25:22
lot more responsive a lot more resilient and optimized to make so to make sure
25:24
and optimized to make so to make sure
25:24
and optimized to make so to make sure it's doing the least amount of requests
25:27
it's doing the least amount of requests
25:27
it's doing the least amount of requests to show the UI that we want to
25:29
to show the UI that we want to
25:29
to show the UI that we want to show and that's pretty much it that's a
25:31
show and that's pretty much it that's a
25:31
show and that's pretty much it that's a lot