prettier markdown

Manav Rathi 2024-04-03 13:33:42 +05:30
parent 212dcfb88a
commit ad6dea2ecb
No known key found for this signature in database
11 changed files with 201 additions and 154 deletions


@@ -1,3 +1,2 @@
thirdparty/
public/
*.md


@@ -1,5 +1,6 @@
{
"tabWidth": 4,
"proseWrap": "always",
"plugins": [
"prettier-plugin-organize-imports",
"prettier-plugin-packagejson"


@@ -4,8 +4,8 @@ Source code for Ente's various web apps and supporting websites.
Live versions are at:
* Ente Photos: [web.ente.io](https://web.ente.io)
* Ente Auth: [auth.ente.io](https://auth.ente.io)
- Ente Photos: [web.ente.io](https://web.ente.io)
- Ente Auth: [auth.ente.io](https://auth.ente.io)
To know more about Ente, see [our main README](../README.md) or visit
[ente.io](https://ente.io).
@@ -49,8 +49,8 @@ For more details about development workflows, see [docs/dev](docs/dev.md).
As a brief overview, this directory contains the following apps:
* `apps/photos`: A fully functional web client for Ente Photos.
* `apps/auth`: A view-only client for Ente Auth. Currently you can only view
- `apps/photos`: A fully functional web client for Ente Photos.
- `apps/auth`: A view-only client for Ente Auth. Currently you can only view
your 2FA codes using this web app. For adding and editing your 2FA codes,
please use the Ente Auth [mobile/desktop app](../auth/README.md) instead.
@@ -58,8 +58,8 @@ These two are the public facing apps. There are other parts of the code which are
accessed as features within the main apps, but in terms of code are
independently maintained and deployed:
* `apps/accounts`: Passkey support (Coming soon)
* `apps/cast`: Chromecast support (Coming soon)
- `apps/accounts`: Passkey support (Coming soon)
- `apps/cast`: Chromecast support (Coming soon)
> [!NOTE]
>
@@ -81,12 +81,12 @@ City coordinates from [Simple Maps](https://simplemaps.com/data/world-cities)
[![Crowdin](https://badges.crowdin.net/ente-photos-web/localized.svg)](https://crowdin.com/project/ente-photos-web)
If you're interested in helping out with translation, please visit our [Crowdin
project](https://crowdin.com/project/ente-photos-web) to get started. Thank you
for your support.
If you're interested in helping out with translation, please visit our
[Crowdin project](https://crowdin.com/project/ente-photos-web) to get started.
Thank you for your support.
If your language is not listed for translation, please [create a GitHub
issue](https://github.com/ente-io/ente/issues/new?title=Request+for+New+Language+Translation&body=Language+name%3A)
If your language is not listed for translation, please
[create a GitHub issue](https://github.com/ente-io/ente/issues/new?title=Request+for+New+Language+Translation&body=Language+name%3A)
to have it added.
## Contribute


@@ -20,6 +20,7 @@ Add the following to `web/apps/photos/.env.local`:
NEXT_PUBLIC_ENTE_ENDPOINT = http://localhost:8080
NEXT_PUBLIC_ENTE_PAYMENTS_ENDPOINT = http://localhost:3001
```
Then start it locally
```sh
@@ -35,8 +36,8 @@ This tells it to connect to the museum and payments app running on localhost.
### Payments app
For this (payments) web app, configure it to connect to the local museum, and
use a set of (development) Stripe keys which can be found in [Stripe's developer
dashboard](https://dashboard.stripe.com).
use a set of (development) Stripe keys which can be found in
[Stripe's developer dashboard](https://dashboard.stripe.com).
Add the following to `web/apps/payments/.env.local`:
@@ -58,8 +59,8 @@ yarn dev:payments
2. Define this secret within your `museum.yaml`
3. Update the `whitelisted-redirect-urls` so that it supports redirecting to
the locally running payments app.
3. Update the `whitelisted-redirect-urls` so that it supports redirecting to the
locally running payments app.
Assuming that your local payments app is running on `localhost:3001`, your
`server/museum.yaml` should look as follows.
@@ -69,7 +70,9 @@ stripe:
us:
key: stripe_dev_key
webhook-secret: stripe_dev_webhook_secret
whitelisted-redirect-urls: ["http://localhost:3000/gallery", "http://192.168.1.2:3001/frameRedirect"]
whitelisted-redirect-urls:
- "http://localhost:3000/gallery"
- "http://192.168.1.2:3001/frameRedirect"
path:
success: ?status=success&session_id={CHECKOUT_SESSION_ID}
cancel: ?status=fail&reason=canceled
@@ -80,7 +83,7 @@ Make sure you have test plans available for museum to use, by placing them in
Finally, start museum, for example:
```
```sh
docker compose up
```


@@ -5,20 +5,20 @@
These are some global dev dependencies in the root `package.json`. These set the
baseline for how our code should be across all the workspaces in this (yarn)
monorepo.
* "prettier" - Formatter
* "eslint" - Linter
* "typescript" - Type checker
- "prettier" - Formatter
- "eslint" - Linter
- "typescript" - Type checker
They also need some support packages, which come from the leaf `@/build-config`
package:
* "@typescript-eslint/parser" - Tells ESLint how to read TypeScript syntax
* "@typescript-eslint/eslint-plugin" - Provides TypeScript rules and presets
* "eslint-plugin-react-hooks", "eslint-plugin-react-namespace-import" - Some
React specific ESLint rules and configurations that are used by the workspaces
that have React code.
* "prettier-plugin-organize-imports" - A Prettier plugin to sort imports.
* "prettier-plugin-packagejson" - A Prettier plugin to also prettify
- "@typescript-eslint/parser" - Tells ESLint how to read TypeScript syntax
- "@typescript-eslint/eslint-plugin" - Provides TypeScript rules and presets
- "eslint-plugin-react-hooks", "eslint-plugin-react-namespace-import" - Some
React specific ESLint rules and configurations that are used by the
workspaces that have React code.
- "prettier-plugin-organize-imports" - A Prettier plugin to sort imports.
- "prettier-plugin-packagejson" - A Prettier plugin to also prettify
`package.json`.
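The root manifest entries for these tools look roughly like this (a sketch; the version ranges here are placeholders, not the repo's actual pins):

```json
{
    "devDependencies": {
        "eslint": "^8",
        "prettier": "^3",
        "typescript": "^5"
    }
}
```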
## Utils
@@ -31,11 +31,11 @@ Emscripten, maintained by the original authors of libsodium themselves -
[libsodium-wrappers](https://github.com/jedisct1/libsodium.js).
Currently, we've pinned the version to 0.7.9 since later versions remove the
crypto_pwhash_* functionality that we use (they've not been deprecated, they've
just been moved to a different NPM package). From the (upstream) [release
notes](https://github.com/jedisct1/libsodium/releases/tag/1.0.19-RELEASE):
`crypto_pwhash_*` functionality that we use (they've not been deprecated,
they've just been moved to a different NPM package). From the (upstream)
[release notes](https://github.com/jedisct1/libsodium/releases/tag/1.0.19-RELEASE):
> Emscripten: the crypto_pwhash_*() functions have been removed from Sumo
> Emscripten: the `crypto_pwhash_*()` functions have been removed from Sumo
> builds, as they reserve a substantial amount of JavaScript memory, even when
> not used.
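Concretely, the pin means the dependency is declared with an exact version rather than a range (a sketch of the relevant `package.json` entry; the real manifest has more fields):

```json
{
    "dependencies": {
        "libsodium-wrappers": "0.7.9"
    }
}
```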
@@ -62,13 +62,13 @@ preferred CSS-in-JS library.
Emotion itself comes in many parts, of which we need the following:
* "@emotion/react" - React interface to Emotion. In particular, we set this as
- "@emotion/react" - React interface to Emotion. In particular, we set this as
the package that handles the transformation of JSX into JS (via the
`jsxImportSource` property in `tsconfig.json`).
* "@emotion/styled" - Provides the `styled` utility, a la styled-components. We
don't use it directly, instead we import it from `@mui/material`. However, MUI
docs
- "@emotion/styled" - Provides the `styled` utility, a la styled-components.
We don't use it directly, instead we import it from `@mui/material`.
However, MUI docs
[mention](https://mui.com/material-ui/integrations/interoperability/#styled-components)
that
@@ -87,8 +87,8 @@ selectors API (i.e. using `styled.div` instead of `styled("div")`).
There is a way of enabling it by installing the `@emotion/babel-plugin` and
specifying the import map as mentioned
[here](https://mui.com/system/styled/#how-to-use-components-selector-api) ([full
example](https://github.com/mui/material-ui/issues/27380#issuecomment-928973157)),
[here](https://mui.com/system/styled/#how-to-use-components-selector-api)
([full example](https://github.com/mui/material-ui/issues/27380#issuecomment-928973157)),
but that disables the SWC integration altogether, so we live with this
infelicity for now.
@@ -97,10 +97,11 @@ infelicity for now.
For showing the app's UI in multiple languages, we use the i18next library,
specifically its three components
* "i18next": The core `i18next` library.
* "i18next-http-backend": Adds support for initializing `i18next` with JSON file
containing the translation in a particular language, fetched at runtime.
* "react-i18next": React specific support in `i18next`.
- "i18next": The core `i18next` library.
- "i18next-http-backend": Adds support for initializing `i18next` with JSON
file containing the translation in a particular language, fetched at
runtime.
- "react-i18next": React specific support in `i18next`.
Note that in spite of the "next" in the name of the library, it has nothing to do
with Next.js.
@@ -120,4 +121,3 @@ set of defaults for bundling our app into a static export which we can then
deploy to our webserver. In addition, the Next.js page router is convenient.
Apart from this, while we use a few tidbits from Next.js here and there, overall
our apps are regular React SPAs, and are not particularly tied to Next.


@@ -3,15 +3,15 @@
The various web apps and static sites in this repository are deployed on
Cloudflare Pages.
* Production deployments are triggered by pushing to the `deploy/*` branches.
- Production deployments are triggered by pushing to the `deploy/*` branches.
* [help.ente.io](https://help.ente.io) gets deployed whenever a PR that changes
anything inside `docs/` gets merged to `main`.
- [help.ente.io](https://help.ente.io) gets deployed whenever a PR that
changes anything inside `docs/` gets merged to `main`.
* Every night, all the web apps get automatically deployed to nightly preview
URLs (`*.ente.sh`) using the current code in main.
- Every night, all the web apps get automatically deployed to nightly preview
URLs (`*.ente.sh`) using the current code in main.
* A preview deployment can be made by triggering the "Preview (web)" workflow.
- A preview deployment can be made by triggering the "Preview (web)" workflow.
This allows us to deploy a build of any of the apps from an arbitrary branch
to [preview.ente.sh](https://preview.ente.sh).
@@ -30,7 +30,7 @@ Here is a list of all the deployments, whether or not they are production
deployments, and the action that triggers them:
| URL | Type | Deployment action |
|-----|------|------------------|
| -------------------------------------------- | ---------- | ------------------------------------------- |
| [web.ente.io](https://web.ente.io) | Production | Push to `deploy/photos` |
| [photos.ente.io](https://photos.ente.io) | Production | Alias of [web.ente.io](https://web.ente.io) |
| [auth.ente.io](https://auth.ente.io) | Production | Push to `deploy/auth` |
@@ -79,21 +79,20 @@ likely don't need to know them to be able to deploy.
## First time preparation
Create a new Pages project in Cloudflare, setting it up to use [Direct
Upload](https://developers.cloudflare.com/pages/get-started/direct-upload/).
Create a new Pages project in Cloudflare, setting it up to use
[Direct Upload](https://developers.cloudflare.com/pages/get-started/direct-upload/).
> [!NOTE]
>
> Direct upload doesn't work for existing projects tied to your repository using
> the [Git
> integration](https://developers.cloudflare.com/pages/get-started/git-integration/).
> the
> [Git integration](https://developers.cloudflare.com/pages/get-started/git-integration/).
>
> If you want to keep the pages.dev domain from an existing project, you should
> be able to delete your existing project and recreate it (assuming no one
> claims the domain in the middle). I've not seen this documented anywhere, but
> it worked when I tried, and it seems to have worked for [other people
> too](https://community.cloudflare.com/t/linking-git-repo-to-existing-cf-pages-project/530888).
> it worked when I tried, and it seems to have worked for
> [other people too](https://community.cloudflare.com/t/linking-git-repo-to-existing-cf-pages-project/530888).
There are two ways to create a new project, using Wrangler
[[1](https://github.com/cloudflare/pages-action/issues/51)] or using the
@@ -101,9 +100,8 @@ Cloudflare dashboard
[[2](https://github.com/cloudflare/pages-action/issues/115)]. Since this is a
one-time thing, the second option might be easier.
The remaining steps are documented in [Cloudflare's guide for using Direct
Upload with
CI](https://developers.cloudflare.com/pages/how-to/use-direct-upload-with-continuous-integration/).
The remaining steps are documented in
[Cloudflare's guide for using Direct Upload with CI](https://developers.cloudflare.com/pages/how-to/use-direct-upload-with-continuous-integration/).
As a checklist,
- Generate `CLOUDFLARE_API_TOKEN`
@@ -125,12 +123,11 @@ the `branch` parameter that gets passed to `cloudflare/pages-action`.
Since our root pages project is `ente.pages.dev`, a branch named `foo` would
be available at `foo.ente.pages.dev`.
Finally, we create CNAME aliases using a [Custom Domain in
Cloudflare](https://developers.cloudflare.com/pages/how-to/custom-branch-aliases/)
Finally, we create CNAME aliases using a
[Custom Domain in Cloudflare](https://developers.cloudflare.com/pages/how-to/custom-branch-aliases/)
to point to these deployments from our user facing DNS names.
As a concrete example, the GitHub workflow that deploys `docs/` passes "help" as
the branch name. The resulting deployment is available at "help.ente.pages.dev".
Finally, we add a custom domain to point to it from
[help.ente.io](https://help.ente.io).


@@ -19,14 +19,14 @@ Make sure you're on yarn 1.x series (aka yarn "classic").
Installs dependencies. This needs to be done once, and thereafter whenever there
is a change in `yarn.lock` (e.g. when pulling the latest upstream).
### yarn dev:*
### yarn dev:\*
Launch the app in development mode. There is one `yarn dev:foo` for each app,
e.g. `yarn dev:auth`. `yarn dev` is a shortcut for `yarn dev:photos`.
The ports are different for the main apps (3000) and the various sidecars
(3001, 3002).
### yarn build:*
### yarn build:\*
Build a production export for the app. This is a bunch of static HTML/JS/CSS
that can then be deployed to any web server.
@@ -34,7 +34,7 @@ that can then be deployed to any web server.
There is one `yarn build:foo` for each app, e.g. `yarn build:auth`. The output
will be placed in `apps/<foo>/out`, e.g. `apps/auth/out`.
### yarn preview:*
### yarn preview:\*
Build a production export and start a local web server to serve it. This uses
Python's built-in web server, and is okay for quick testing but should not be
@@ -53,8 +53,8 @@ issues.
The monorepo uses Yarn (classic) workspaces.
To run a command for a workspace `<ws>`, invoke `yarn workspace <ws> <cmd>` from
the root folder instead of the `yarn <cmd>` you'd have done otherwise. For
example, to start a development server for the `photos` app, we can do
the root folder instead of the `yarn <cmd>` you'd have done otherwise. For
example, to start a development server for the `photos` app, we can do
```sh
yarn workspace photos next dev
@@ -74,7 +74,7 @@ by running `yarn install` first.
> `yarn` is a shortcut for `yarn install`
To add a local package as a dependency, use `<package-name>@*`. The "*" here
To add a local package as a dependency, use `<package-name>@*`. The "\*" here
denotes any version.
```sh


@@ -7,26 +7,28 @@ Within our project we have the _source_ strings - these are the key value pairs
in the `public/locales/en-US/translation.json` file in each app.
Volunteers can add a new _translation_ in their language corresponding to each
such source key-value to our [Crowdin
project](https://crowdin.com/project/ente-photos-web).
such source key-value to our
[Crowdin project](https://crowdin.com/project/ente-photos-web).
Everyday, we run a [GitHub workflow](../../.github/workflows/web-crowdin.yml)
that
* Uploads sources to Crowdin - So any new key value pair we add in the source
- Uploads sources to Crowdin - So any new key value pair we add in the source
`translation.json` becomes available to translators to translate.
* Downloads translations from Crowdin - So any new translations that translators
have made on the Crowdin dashboard (for existing sources) will be added to the
corresponding `lang/translation.json`.
- Downloads translations from Crowdin - So any new translations that
translators have made on the Crowdin dashboard (for existing sources) will
be added to the corresponding `lang/translation.json`.
The workflow also uploads existing translations and downloads new sources
from Crowdin, but these two should be no-ops.
## Adding a new string
- Add a new entry in `public/locales/en-US/translation.json` (the **source `translation.json`**).
- Use the new key in code with the `t` function (`import { t } from "i18next"`).
- Add a new entry in `public/locales/en-US/translation.json` (the **source
`translation.json`**).
- Use the new key in code with the `t` function
(`import { t } from "i18next"`).
- During the next sync, the workflow will upload this source item to Crowdin's
dashboard, allowing translators to translate it.
@@ -40,6 +42,5 @@ from Crowdin, but these two should be no-ops.
- Remove the key value pair from the source `translation.json`.
- During the next sync, the workflow will delete that source item from all
existing translations (both in the Crowdin project and also from the other
`lang/translation.json` files in the repository).
existing translations (both in the Crowdin project and also from the
other `lang/translation.json` files in the repository).


@@ -1,6 +1,14 @@
# Passkeys on Ente
Passkeys is a colloquial term for a relatively new authentication standard called [WebAuthn](https://en.wikipedia.org/wiki/WebAuthn). Now rolled out to all major browsers and operating systems, it uses asymmetric cryptography to authenticate the user with a server using replay-attack resistant signatures. These processes are usually abstracted from the user through biometric prompts, such as Touch ID/Face ID/Optic ID, Fingerprint Unlock and Windows Hello. These passkeys can also be securely synced by major password managers, such as Bitwarden and 1Password, although the syncing experience can greatly vary due to some operating system restrictions.
Passkeys is a colloquial term for a relatively new authentication standard
called [WebAuthn](https://en.wikipedia.org/wiki/WebAuthn). Now rolled out to all
major browsers and operating systems, it uses asymmetric cryptography to
authenticate the user with a server using replay-attack resistant signatures.
These processes are usually abstracted from the user through biometric prompts,
such as Touch ID/Face ID/Optic ID, Fingerprint Unlock and Windows Hello. These
passkeys can also be securely synced by major password managers, such as
Bitwarden and 1Password, although the syncing experience can greatly vary due to
some operating system restrictions.
## Terms
@@ -13,13 +21,20 @@ Passkeys is a colloquial term for a relatively new authentication standard calle
## Getting to the passkeys manager
As of Feb 2024, Ente clients have a button to navigate to a WebView of Ente Accounts. Ente Accounts allows users to add and manage their registered passkeys.
As of Feb 2024, Ente clients have a button to navigate to a WebView of Ente
Accounts. Ente Accounts allows users to add and manage their registered
passkeys.
❗ Your WebView MUST invoke the operating system's default browser, or an equivalent browser with matching API parity. Otherwise, the user will not be able to register or use registered WebAuthn credentials.
❗ Your WebView MUST invoke the operating system's default browser, or an
equivalent browser with matching API parity. Otherwise, the user will not be
able to register or use registered WebAuthn credentials.
### Accounts-Specific Session Token
When a user clicks this button, the client sends a request for an Accounts-specific JWT session token as shown below. **The Ente Accounts API is restricted to this type of session token, so the user session token cannot be used.** This restriction is a byproduct of enabling automatic login.
When a user clicks this button, the client sends a request for an
Accounts-specific JWT session token as shown below. **The Ente Accounts API is
restricted to this type of session token, so the user session token cannot be
used.** This restriction is a byproduct of enabling automatic login.
#### GET /users/accounts-token
@@ -37,17 +52,33 @@ When a user clicks this button, the client sends a request for an Accounts-speci
### Automatically logging into Accounts
Clients open a WebView with the URL `https://accounts.ente.io/accounts-handoff?token=<accountsToken>&package=<app package name>`. This page will appear like a normal loading screen to the user, but in the background, the app parses the token and package for usage in subsequent Accounts-related API calls.
Clients open a WebView with the URL
`https://accounts.ente.io/accounts-handoff?token=<accountsToken>&package=<app package name>`.
This page will appear like a normal loading screen to the user, but in the
background, the app parses the token and package for usage in subsequent
Accounts-related API calls.
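Constructing that URL can be sketched as follows (the token and package values here are hypothetical placeholders; `URLSearchParams` takes care of the escaping):

```typescript
// Build the accounts-handoff URL from an accounts token and the
// client app's package name.
const handoffURL = (accountsToken: string, pkg: string) =>
    `https://accounts.ente.io/accounts-handoff?${new URLSearchParams({
        token: accountsToken,
        package: pkg,
    })}`;
```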
If valid, the user will be automatically redirected to the passkeys management page. Otherwise, they will be required to login with their Ente credentials.
If valid, the user will be automatically redirected to the passkeys management
page. Otherwise, they will be required to login with their Ente credentials.
## Registering a WebAuthn credential
### Requesting publicKey options (begin)
The registration ceremony starts in the browser. When the user clicks the "Add new passkey" button, a request is sent to the server for "public key" creation options. Although named "public key" options, they actually define customizable parameters for the entire credential creation process. They're like an instructional sheet that defines exactly what we want. As of the creation of this document, the plan is to restrict user authenticators to cross-platform ones, like hardware keys. Platform authenticators, such as TPM, are not portable and are prone to loss.
The registration ceremony starts in the browser. When the user clicks the "Add
new passkey" button, a request is sent to the server for "public key" creation
options. Although named "public key" options, they actually define customizable
parameters for the entire credential creation process. They're like an
instructional sheet that defines exactly what we want. As of the creation of
this document, the plan is to restrict user authenticators to cross-platform
ones, like hardware keys. Platform authenticators, such as TPM, are not portable
and are prone to loss.
On the server side, the WebAuthn library generates this information based on data provided from a `webauthn.User` interface. As a result, we satisfy this interface by creating a type with methods returning information from the database. Information stored in the database about credentials is pre-processed using base64 where necessary.
On the server side, the WebAuthn library generates this information based on
data provided from a `webauthn.User` interface. As a result, we satisfy this
interface by creating a type with methods returning information from the
database. Information stored in the database about credentials is
pre-processed using base64 where necessary.
```go
type PasskeyUser struct {
@@ -162,24 +193,26 @@ func (u *PasskeyUser) WebAuthnCredentials() []webauthn.Credential {
### Pre-processing the options before registration
Even though the server generates these options, the browser still doesn't understand them. For interoperability, the server's WebAuthn library returns binary data in base64, like IDs and the challenge. However, the browser requires this data back in binary.
Even though the server generates these options, the browser still doesn't
understand them. For interoperability, the server's WebAuthn library returns
binary data in base64, like IDs and the challenge. However, the browser requires
this data back in binary.
We just have to decode the base64 fields back into `Uint8Array`.
```ts
const options = response.options;
options.publicKey.challenge = _sodium.from_base64(
options.publicKey.challenge,
);
options.publicKey.user.id = _sodium.from_base64(
options.publicKey.user.id,
);
options.publicKey.challenge = _sodium.from_base64(options.publicKey.challenge);
options.publicKey.user.id = _sodium.from_base64(options.publicKey.user.id);
```
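For reference, the same decode step can be reproduced with Node's `Buffer` (a standalone sketch assuming a Node runtime; the app itself uses `_sodium.from_base64`):

```typescript
// Hypothetical helper mirroring _sodium.from_base64:
// unpadded base64url string -> Uint8Array of raw bytes.
const fromB64Url = (s: string): Uint8Array =>
    new Uint8Array(Buffer.from(s, "base64url"));

// "AQID" is base64url for the bytes [1, 2, 3].
const challenge = fromB64Url("AQID");
```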
### Creating the credential
We use `navigator.credentials.create` with these options to generate the credential. At this point, the user will see a prompt to decide where to save this credential, and probably a biometric authentication gate depending on the platform.
We use `navigator.credentials.create` with these options to generate the
credential. At this point, the user will see a prompt to decide where to save
this credential, and probably a biometric authentication gate depending on the
platform.
```ts
const newCredential = await navigator.credentials.create(options);
@@ -187,7 +220,8 @@ const newCredential = await navigator.credentials.create(options);
### Sending the public key to the server (finish)
The browser returns the newly created credential with a bunch of binary fields, so we have to encode them into base64 for transport to the server.
The browser returns the newly created credential with a bunch of binary fields,
so we have to encode them into base64 for transport to the server.
```ts
const attestationObjectB64 = _sodium.to_base64(
@@ -199,13 +233,18 @@ const clientDataJSONB64 = _sodium.to_base64(
_sodium.base64_variants.URLSAFE_NO_PADDING
```
Attestation object contains information about the nature of the credential, like what device it was generated on. Client data JSON contains metadata about the credential, like where it is registered to.
Attestation object contains information about the nature of the credential, like
what device it was generated on. Client data JSON contains metadata about the
credential, like where it is registered to.
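To illustrate why the client data is useful server-side, decoding it yields plain JSON (a sketch using a synthetic value and Node's `Buffer`; the real `clientDataJSON` is produced by the browser and also carries the challenge):

```typescript
// Synthetic clientDataJSON, base64url-encoded the way the client
// would send it over the wire.
const clientDataJSONB64 = Buffer.from(
    JSON.stringify({
        type: "webauthn.create",
        origin: "https://accounts.ente.io",
    }),
).toString("base64url");

// The server can decode it and inspect, e.g., the ceremony type
// and the origin it was registered against.
const clientData = JSON.parse(
    Buffer.from(clientDataJSONB64, "base64url").toString("utf8"),
);
```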
After pre-processing, the client sends the public key to the server so it can verify future signatures during authentication.
After pre-processing, the client sends the public key to the server so it can
verify future signatures during authentication.
#### POST /passkeys/registration/finish
When the server receives the new public key credential, it pre-processes the JSON objects so they can fit within the database. This includes base64 encoding `[]byte` slices and their encompassing arrays or objects.
When the server receives the new public key credential, it pre-processes the
JSON objects so they can fit within the database. This includes base64 encoding
`[]byte` slices and their encompassing arrays or objects.
```go
// Convert the PublicKey to base64
@@ -280,7 +319,11 @@ On retrieval, this process is effectively the opposite.
## Authenticating with a credential
Passkeys have been integrated into the existing two-factor ceremony. When logging in via SRP or verifying an email OTT, the server checks if the user has any number of credentials set up or has 2FA TOTP enabled. If the user has set up at least one credential, they will be served a `passkeySessionID` which will initiate the authentication ceremony.
Passkeys have been integrated into the existing two-factor ceremony. When
logging in via SRP or verifying an email OTT, the server checks if the user has
any number of credentials set up or has 2FA TOTP enabled. If the user has set
up at least one credential, they will be served a `passkeySessionID` which will
initiate the authentication ceremony.
```tsx
const {
@@ -294,7 +337,9 @@ if (passkeySessionID) {
}
```
The client should redirect the user to Accounts with this session ID to prompt credential authentication. We use Accounts as the central WebAuthn hub because credentials are locked to an FQDN.
The client should redirect the user to Accounts with this session ID to prompt
credential authentication. We use Accounts as the central WebAuthn hub because
credentials are locked to an FQDN.
```tsx
window.location.href = `${getAccountsURL()}/passkeys/flow?passkeySessionID=${passkeySessionID}&redirect=${
@@ -344,7 +389,8 @@ window.location.href = `${getAccountsURL()}/passkeys/flow?passkeySessionID=${pas
### Pre-processing the options before retrieval
The browser requires `Uint8Array` versions of the `options` challenge and credential IDs.
The browser requires `Uint8Array` versions of the `options` challenge and
credential IDs.
```ts
publicKey.challenge = _sodium.from_base64(
@@ -369,7 +415,8 @@ const credential = await navigator.credentials.get({
### Pre-processing the credential metadata and signature before authentication
Before sending the public key and signature to the server, their outputs must be encoded into Base64.
Before sending the public key and signature to the server, their outputs must be
encoded into Base64.
```ts
authenticatorData: _sodium.to_base64(


@@ -13,8 +13,8 @@ transpiled, it just exports static files that can be included verbatim.
To see what tsc is seeing (say when it is trying to type-check `@/utils`), use
`yarn workspace @/utils tsc --showConfig`.
Similarly, to verify what ESLint is trying to do, use `yarn workspace @/utils
eslint --debug .`
Similarly, to verify what ESLint is trying to do, use
`yarn workspace @/utils eslint --debug .`
If the issue is in VSCode, open the output window of the corresponding plugin,
it might be telling us what's going wrong there. In particular, when changing


@@ -6,4 +6,3 @@ A base package for our UI layer code, for sharing code between our Next.js apps.
This (internal) package exports a React TypeScript library. We rely on the
importing project to transpile and bundle it.