
9 posts tagged with "web"


Why PWA?

· 2 min read

PWA stands for Progressive Web App. Back in 2014, the W3C published a draft of the Service Worker specification, and in 2015 Chrome shipped support for it in production. If we take the emergence of Service Worker, one of PWA's core technologies, as the starting point, PWA's birth year is 2015. Before focusing on what a PWA is, let's first understand why we need it.

  1. User experience. Back in 2015, frontend developers spent a lot of time optimizing the web by speeding up the rendering of the initial page, making animations smoother, etc. However, native apps still won on user experience.

  2. User retention. Native apps can be pinned to the phone's home screen and bring users back via notifications, while web apps cannot.

  3. Leveraging device APIs. Android and iOS provide abundant device APIs that native apps can easily use with the user's permission. However, back then, browsers did not fully support them.

Google's tutorial "Why Build Progressive Web Apps" summarizes the problem as "Broad Reach, Low Engagement".

(Figure: UV and user-duration comparison between websites and native apps)

To tackle these disadvantages of web apps in the mobile age, the PWA came into being.

What is PWA?

· 3 min read

When Google came up with PWA, it didn't give it a precise definition. PWA is not a specific technology, but a combination of techniques to improve the user experience, such as Web App Manifest, Service Worker, and Web Push.

The main features of PWA are as follows.

  • Reliability - instant loading, even in unstable or disconnected network environments.
  • User experience - rapid response, with smooth transition animations and feedback on user actions.
  • Stickiness - like a Native App, it can be added to the home screen and receive push notifications.

PWA itself emphasizes "Progressive" from two perspectives.

  1. PWA is still evolving;
  2. PWA is downward-compatible and non-intrusive. It costs developers little to adopt the new features - they can be added to an existing site progressively.

Google's "Progressive Web App Checklist" defines those minimum requirements for PWA.

  • Served over HTTPS
  • Pages are responsive on desktop, tablet, and mobile devices
  • All URLs show content when offline, instead of the browser's default error page (see the sketch after this list)
  • Provides a Web App Manifest so the app can be added to the home screen
  • Fast page loads and low latency, even on 3G networks
  • Displays correctly in all major browsers
  • Smooth animations with immediate feedback
  • Each page has its own URL
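For the offline requirement, a minimal Service Worker sketch could look like the following; the cache name and /offline.html path are placeholders, not part of Google's checklist.

```js
// Cache an offline fallback page at install time and serve it
// whenever a navigation request fails.
self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open('offline-v1').then((cache) => cache.add('/offline.html'))
  );
});

self.addEventListener('fetch', (event) => {
  if (event.request.mode === 'navigate') {
    event.respondWith(
      fetch(event.request).catch(() => caches.match('/offline.html'))
    );
  }
});
```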

Features of PWA

A PWA combines the benefits of both the Web App and Native App and gives us the following features.

  • Progressive - works in any browser, as it is built with progressive enhancement in mind.
  • Connectivity agnostic - leverages Service Worker to work offline or on low-quality networks.
  • Native experience - built on the App Shell model, with Native App-like interactions.
  • Continued updates - always up-to-date, with no versioning or update issues.
  • Security - served over HTTPS.
  • Indexable - manifest files and Service Workers can be recognized and indexed by search engines.
  • Stickiness - push notifications and similar features bring users back to your app.
  • Installable - users can easily add web apps to the home screen or desktop without going through an app store.
  • Linkable - content can be shared through links without downloading and installing anything.

More specifically, what is the advantage of a PWA over a native app? Openness and indexability: users can hardly install a native app instantly or search across native apps seamlessly.

The table below compares the traditional Web App, the Native App, and the PWA on each feature.

(Table: Installable / Linkable / User experience / User stickiness comparison across Traditional Web, Native App, and PWA.)

Rewriting Facebook.com

· 3 min read

Facebook has evolved from its initial PHP server-side rendered website over the past 16 years. The external environment for web development has changed dramatically, and the cost of developing new features on the old architecture is increasing. To achieve an "app-like experience" and outstanding performance, they rewrote the entire main website using React and Relay, based on two principles — "as little as possible, as early as possible" and "enhancing developer experience to serve user experience."

Applying these two principles to four main elements (CSS, JavaScript, data, navigation) yields the following insights:

  1. Improve CSS
    1. Atomic CSS: Using atomic class CSS generated at build time reduced the homepage's CSS by 80%, because the number of entries in the generated stylesheet grows roughly logarithmically - with the number of unique style declarations rather than with the amount of code and features that use them. We at Uber use Styletron for the same purpose.
    2. CSS-in-JavaScript: Using methods like stylex.create({}) to generate styles for components, colocating them with the components to enhance removability and make styles easier to maintain.
    3. Consistently use rem for a better scaling experience, automatically converting px to rem at build time.
    4. Implement dark mode using CSS variables.
    5. Use inline SVG in HTML as icons to solve the flash issue of delayed icon rendering.
  2. Split JavaScript to optimize performance
    1. Incrementally load code, breaking down 500 KB of total code into 50 KB Tier 1, 150 KB Tier 2, and 300 KB Tier 3, corresponding to skeleton content, initial screen content, and non-UI content, loading them as needed.
    2. Load experimental code only when necessary.
    3. Load the corresponding code based on data, such as loading image components for image data and video components for video data (see the sketch after this list).
    4. Budget the size of JavaScript and strictly monitor changes in code size.
  3. Load data as early as possible
    1. Preloading: Using Relay to immediately know the data needed for the initial screen, streaming this data while downloading the code.
    2. Reduce round trips using streaming.
    3. Delay loading data that is not currently needed.
  4. Define route mapping to accelerate navigation
    1. Obtain route definitions as early as possible.
    2. Pre-fetch resources early, starting when hovering or focusing. After navigation changes, if loading is not complete, retain the current page using React Suspense transitions until loading is finished before switching. This keeps the experience consistent with standard browser behavior.
    3. Download code and data in parallel. Typically, code is downloaded first followed by data, which is serial; Facebook allows data and code to download simultaneously in one round trip.
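As a rough illustration of point 2.3, one way to load renderer code based on the data type is React.lazy with Suspense. The component names and module paths below are made up for illustration and are not Facebook's actual code.

```jsx
import React, { Suspense, lazy } from 'react';

// Hypothetical renderers, each emitted as its own chunk by the bundler;
// the video player code is only downloaded when a post actually contains video.
const renderers = {
  photo: lazy(() => import('./PhotoAttachment')),
  video: lazy(() => import('./VideoAttachment')),
};

function Attachment({ attachment }) {
  const Renderer = renderers[attachment.type];
  if (!Renderer) return null; // unknown attachment types render nothing
  return (
    // The placeholder shows while the type-specific chunk is downloading.
    <Suspense fallback={<div className="placeholder" />}>
      <Renderer attachment={attachment} />
    </Suspense>
  );
}

export default Attachment;
```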

Web App Delivery Optimization

· 4 min read

Two golden rules: minimize 1) latency 2) payload

To Minimize Latency…

  • Reduce DNS lookups

    • Use a fast DNS provider; by average resolution time: Cloudflare < DNS Made Easy < AWS Route 53 < GoDaddy < NameCheap. NOTE: results vary by region
    • DNS caching. The TTL is a trade-off between performance and up-to-dateness
    • reduce the number of third-party domains, or use services with fast DNS (conflicts with the domain-sharding optimization for HTTP/1.x)
    • ==DNS prefetching== `<link rel="dns-prefetch" href="//www.example.com/">`
  • reuse TCP connections.

  • minimize number of HTTP redirects

  • use a CDN

    • e.g. Netflix develops its own CDN hardware and cooperates with local ISPs to serve content
  • eliminate unnecessary resources

  • cache resources on the client (see the sketch after this list)

    1. HTTP cache headers
      • cache-control with max-age
        • Note, for JS files: a simple way to ensure the browser picks up changed files is to use output.filename substitutions with hashes (Webpack Caching guide)
      • expires
        • If both Expires and max-age are set, max-age takes precedence.
    2. last-modified and ETag headers to validate whether the resource has been updated since we last loaded it
      • time-based Last-Modified response header (less useful behind nginx and microservices, where the modification time is often not meaningful)
      • content-based ETag (Entity Tag)
        • useful when the last-modified date is difficult to determine
        • computed by hashing the content
    3. a common mistake is to set only one of the two above
  • compress assets during transfer

    • use JPEG, WebP instead of PNG
    • HTTP/2 compresses headers automatically
    • enable gzip compression in nginx
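A minimal Node.js sketch of the client-caching headers discussed above; the asset map, hashed file name, and max-age value are placeholders.

```js
const http = require('http');
const crypto = require('crypto');

const assets = { '/app.3f2a1b.js': 'console.log("hello");' }; // hypothetical hashed bundle

http.createServer((req, res) => {
  const body = assets[req.url];
  if (!body) {
    res.writeHead(404);
    return res.end();
  }

  // Content-based validator (ETag), useful when Last-Modified is hard to determine.
  const etag = `"${crypto.createHash('sha1').update(body).digest('hex')}"`;
  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304); // not modified: the client reuses its cached copy
    return res.end();
  }

  res.writeHead(200, {
    'Content-Type': 'application/javascript',
    // Hash-named files can be cached "forever"; the name changes when the content does.
    'Cache-Control': 'public, max-age=31536000, immutable',
    ETag: etag,
  });
  res.end(body);
}).listen(8080);
```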

To Minimize Payload…

  • eliminate unnecessary request bytes

    • especially for cookies
      • even though the HTTP standard does not specify a size limit on headers or cookies, browsers and servers often enforce limits:
        • ~4 KB limit on cookies
        • 8 KB ~ 16 KB limit on headers
      • cookies are attached to every request
  • parallelize request and response processing

    • while the browser is blocked on a resource, the preload scanner looks ahead and dispatches downloads in advance: ~20% improvement

Applying protocol-specific optimizations

  • HTTP/1.x

    • use HTTP keepalive and HTTP pipelining: dispatch multiple requests, in parallel, over the same connection, without waiting for a response in serial fashion.
    • browsers could only open a limited number of connections to a particular domain, so …
      • domain sharding = more origins * 6 connections per origin (the extra DNS lookups may add latency)
      • bundle resources to reduce HTTP requests
      • inline small resource
  • HTTP/2

    • With the binary framing layer introduced, we get one connection per origin with multiplexing, stream prioritization, flow control, and server push, so remove the HTTP/1.x optimizations:
      • remove unnecessary concatenation and image splitting
      • use server push: previously inlined resources can be pushed and cached (see the sketch after this list)
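A minimal Node.js sketch of server push with the built-in http2 module; the certificate paths and pushed asset are placeholders (note that some browsers have since dropped support for server push).

```js
const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('localhost-key.pem'),   // placeholder certificate files
  cert: fs.readFileSync('localhost-cert.pem'),
});

server.on('stream', (stream, headers) => {
  if (headers[':path'] === '/') {
    // Push the stylesheet that the page would otherwise inline or request later.
    stream.pushStream({ ':path': '/style.css' }, (err, pushStream) => {
      if (err) return;
      pushStream.respond({ ':status': 200, 'content-type': 'text/css' });
      pushStream.end('body { margin: 0; }');
    });
    stream.respond({ ':status': 200, 'content-type': 'text/html' });
    stream.end('<link rel="stylesheet" href="/style.css"><h1>Hello</h1>');
  }
});

server.listen(8443);
```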


How to Get Lucky?

· 2 min read

Self-fulfilling prophecy: Believe that you are lucky, and you will be lucky, because you will be more willing to explore opportunities, accept challenges, and persevere - rather than just feeling happy about good outcomes.

Lucky people have three personality traits:

  1. Extroverted
  2. Open-minded
  3. Relaxed (low neuroticism)

How to create good luck?

  1. Participate in new activities and experience new things
  2. Trust your intuition and interests
  3. Be optimistic - as in soccer, take a few more shots and you'll eventually score
  4. Be good at finding the silver lining in bad situations

What is the essence of good luck?

Creating opportunities, discovering opportunities, and having the courage to act

What does this have to do with mindfulness?

Mindfulness = stepping out of your subjective perspective and noticing the value of things around you

Buddhist perspective:

  • We oppose the "law of attraction" because fantasizing about good outcomes is actually a form of "greed."
  • We advocate for a systematic approach to probability, which is essentially "form is emptiness."
  • We encourage discovering good luck outside of routine tasks, which actually avoids "delusion."
  • We use holistic thinking to break normalizing biases, which is practicing "no-self."
  • So who says Buddhism is useless? If you truly excel in Buddhism, you wouldn't need to seek the Buddha for trivial matters like promotion and fortune.

Debounce, Throttle and RequestAnimationFrame

· One min read

These are web techniques to optimize UI events handling and make transitions smoother.

  • debounce: Grouping a sudden burst of events (like keystrokes) into a single one.
  • throttle: Guaranteeing a constant flow of executions every X milliseconds. Like checking every 200ms your scroll position to trigger a CSS animation.
  • requestAnimationFrame: an alternative to throttle for functions that recalculate and render elements on screen, when you want to guarantee smooth changes or animations. Note: no IE9 support.

What is the difference between debounce and throttle? A hand-rolled sketch of both is below.
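The following is an illustrative sketch, not production code (libraries such as lodash provide battle-tested versions); the delays and the searchInput/runSearch/updatePosition names are made up.

```js
// debounce: run fn only after `wait` ms have passed without a new call.
function debounce(fn, wait) {
  let timer;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), wait);
  };
}

// throttle: run fn at most once every `interval` ms.
function throttle(fn, interval) {
  let last = 0;
  return function (...args) {
    const now = Date.now();
    if (now - last >= interval) {
      last = now;
      fn.apply(this, args);
    }
  };
}

// e.g. debounce search-as-you-type, throttle scroll handling
searchInput.addEventListener('input', debounce(runSearch, 300));
window.addEventListener('scroll', throttle(updatePosition, 200));
```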

PWA for Mobile Web

· 2 min read

Why Progressive Web App?

  • can be added to the home screen from Chrome and Safari
  • work offline with service workers
  • increase engagement with push notification
  • improve the conversion rate for new users across all browsers by 104% and on iOS by 82%

“Progressive” means the improvement is not binary and terminal but evolutionary.

What is it?

PWA = website optimized for mobile + manifest.json + service worker loading and registering

How to add it to your site?

manifest.json is the easy part. Put the following into example.com/manifest.json:

```json
{
  "short_name": "Short",
  "name": "Longer Name",
  "icons": [
    {
      "src": "favicon.png",
      "sizes": "192x192 150x150 144x144 64x64 32x32 24x24 16x16",
      "type": "image/png"
    }
  ],
  "start_url": "/",
  "display": "standalone",
  "theme_color": "#de4c4f",
  "background_color": "#f3f3f3"
}
```

And add the following into the HTML `<head>`:

<link rel="manifest" href="/manifest.json"/>
<link rel="apple-touch-icon" href="/favicon.png"/>

Then on both iOS and Android, users can add the site to the home screen.

Then ... service worker loading and registering

For the loading part, I recommend create-react-app's service worker loading script, which has good security practices and targets a cache-first strategy. It includes unregister as well.
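A stripped-down sketch of what such a loading script boils down to (the real create-react-app script adds localhost checks and update handling):

```js
// Register the service worker once the page has loaded; '/service-worker.js'
// matches the filename generated by the webpack plugin below.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker
      .register('/service-worker.js')
      .then((registration) => console.log('SW registered with scope:', registration.scope))
      .catch((error) => console.error('SW registration failed:', error));
  });
}
```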

The registering part is trickier - we added the following webpack plugin to prepare the service-worker.js.

```js
// webpack config (excerpt)
const path = require('path');
const glob = require('glob');
const SWPrecacheWebpackPlugin = require('sw-precache-webpack-plugin');

// ...
plugins: [
  // ...
  new SWPrecacheWebpackPlugin({
    mergeStaticsConfig: true,
    dontCacheBustUrlsMatching: /\.\w{8}\./,
    filename: 'service-worker.js',
    minify: false,
    navigateFallback: '/',
    navigateFallbackWhitelist: [/^(?!\/__).*/],
    staticFileGlobs: [`${OUTPUT_DIR}/**`],
    stripPrefix: OUTPUT_DIR,
    staticFileGlobsIgnorePatterns: [/\.map$/, /asset-manifest\.json$/],
    dynamicUrlToDependencies: {
      '/index.html': glob.sync(path.resolve(`${OUTPUT_DIR}/**/*.js`)),
    },
  }),
],
// ...
```

The tricky part here is that if you have SSR as we do, be careful to specify dynamicUrlToDependencies; otherwise, the cache may fail to update.

Designing very large (JavaScript) applications

· 3 min read

Very Large JS App = a lot of developers + large codebase

How to deal with a lot of developers?

empathy

What is a ==senior engineer==? A team of senior engineers without junior engineers is a team of engineers

  1. Being senior means I'd be able to solve almost every problem that somebody might throw at me.
  2. It also means helping the junior engineers eventually become senior engineers.

what’s the next step of a senior engineer?

  1. senior: “I know how I would solve the problem” and because I know how I would solve it I could also teach someone else to do it.
  2. next level: “I know how others would solve the problem”

good programming model

how people write software, e.g. react/redux, npm. Here comes a model that affects all large JS apps - code splitting.

  1. People have to think about what to bundle and when to load it
  2. ==route based code splitting== (see the sketch after this list)
  3. But what if that is not enough?
    1. lazy load every single component of the website
    2. How does Google do it? Split by rendering logic and by application logic: ==simply server-side render a page, and then whatever was actually rendered triggers downloading the associated application bundles.== Google does not do isomorphic rendering - no double rendering
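A minimal sketch of route-based code splitting with dynamic import(); the route names and page modules are made up for illustration.

```js
// Each page module becomes its own chunk and is only downloaded
// when the user actually navigates to that route.
const routes = {
  '/home': () => import('./pages/Home'),
  '/settings': () => import('./pages/Settings'),
};

async function navigate(path) {
  const load = routes[path];
  if (!load) throw new Error(`Unknown route: ${path}`);
  const page = await load();
  // Assumes each page module default-exports a render(container) function.
  page.default.render(document.getElementById('app'));
}
```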

How to deal with a large codebase?

==Code Removability/Delete-ability==

e.g. CSS is bad in code removability

  1. one big css file. There is this selector in there. Who really knows whether that still matches anything in your app? So, you end up just keeping it there.
  2. People thus created CSS-in-JS

==avoid central configuration of your application at all cost==

  1. Bad example
    1. central routes configuration
    2. central webpack.config.js
  2. Good example
    1. decentralized package.json

avoid central import problem: router imports component A, B, and C

  1. to solve this problem, do ==“enhance” instead of “import”==
  2. However, developers still have to decide when to enhance and when to import. Since this might lead to very bad situations, we make “enhance” illegal, nobody gets to use it–with one exception: generated code.

avoid base bundle pile of trash

  1. e.g. base bundle should never contain UI code
  2. Solve this problem with forbidden dependency tests
  3. ==the most straightforward way must be the right way; otherwise, add a test that enforces the right way.==

Be careful with abstractions

We have to become good at finding the right abstractions: Empathy and experience -> Right abstractions

CORS vs CSP

· One min read
  • CORS allows a site A (data provider) to give permission to site B to read (potentially private) data from site A (using the visitor's browser and credentials).
  • CSP allows a site to prevent itself (data consumer) from loading (potentially malicious) content from unexpected sources (e.g. as a defence against XSS).
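A minimal Node.js sketch contrasting the two headers (both shown on one demo server for brevity); the origin and policy values are placeholders.

```js
const http = require('http');

http.createServer((req, res) => {
  // CORS: sent by site A (the data provider) to let pages from site B read this response.
  res.setHeader('Access-Control-Allow-Origin', 'https://site-b.example');

  // CSP: sent with a site's own HTML documents to restrict what they may load.
  res.setHeader('Content-Security-Policy', "default-src 'self'; script-src 'self'");

  res.end('ok');
}).listen(8080);
```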