Initial commit - Event Planner application

This commit is contained in:
mberlin
2026-03-18 14:55:56 -03:00
commit 86d779eb4d
7548 changed files with 1006324 additions and 0 deletions

21
node_modules/kysely/LICENSE generated vendored Normal file

@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2022 Sami Koskimäki

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

240
node_modules/kysely/README.md generated vendored Normal file

@@ -0,0 +1,240 @@
[![Stand With Ukraine](https://raw.githubusercontent.com/vshymanskyy/StandWithUkraine/main/banner2-direct.svg)](https://stand-with-ukraine.pp.ua)
[![NPM Version](https://img.shields.io/npm/v/kysely?style=flat&label=latest)](https://github.com/kysely-org/kysely/releases/latest)
[![Tests](https://github.com/kysely-org/kysely/actions/workflows/test.yml/badge.svg)](https://github.com/kysely-org/kysely)
[![License](https://img.shields.io/github/license/kysely-org/kysely?style=flat)](https://github.com/kysely-org/kysely/blob/master/LICENSE)
[![Issues](https://img.shields.io/github/issues-closed/kysely-org/kysely?logo=github)](https://github.com/kysely-org/kysely/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc)
[![Pull Requests](https://img.shields.io/github/issues-pr-closed/kysely-org/kysely?label=PRs&logo=github&style=flat)](https://github.com/kysely-org/kysely/pulls?q=is%3Apr+is%3Aopen+sort%3Aupdated-desc)
![GitHub contributors](https://img.shields.io/github/contributors/kysely-org/kysely)
[![NPM Downloads](https://img.shields.io/npm/dw/kysely?logo=npm)](https://www.npmjs.com/package/kysely)
[![JSR Downloads](https://jsr.io/badges/@kysely/kysely/weekly-downloads)](https://jsr.io/@kysely/kysely)
[![JSR Score](https://jsr.io/badges/@kysely/kysely/score)](https://jsr.io/@kysely/kysely)
###### Join the discussion
[![Discord](https://img.shields.io/badge/Discord-%235865F2.svg?style=flat&logo=discord&logoColor=white)](https://discord.gg/xyBJ3GwvAm)
[![Bluesky](https://img.shields.io/badge/Bluesky-0285FF?style=flat&logo=Bluesky&logoColor=white)](https://bsky.app/profile/kysely.dev)
###### Get started
[![Postgres](https://img.shields.io/badge/postgres-%23316192.svg?style=flat&logo=postgresql&logoColor=white)](https://kysely.dev/docs/getting-started?dialect=postgresql)
[![MySQL](https://img.shields.io/badge/mysql-4479A1.svg?style=flat&logo=mysql&logoColor=white)](https://kysely.dev/docs/getting-started?dialect=mysql)
[![MicrosoftSQLServer](https://img.shields.io/badge/Microsoft%20SQL%20Server-CC2927?style=flat&logo=microsoft%20sql%20server&logoColor=white)](https://kysely.dev/docs/getting-started?dialect=mssql)
[![SQLite](https://img.shields.io/badge/sqlite-%2307405e.svg?style=flat&logo=sqlite&logoColor=white)](https://kysely.dev/docs/getting-started?dialect=sqlite)
& more!
# [Kysely](https://kysely.dev)
Kysely (pronounce “Key-Seh-Lee”) is a type-safe and autocompletion-friendly [TypeScript](https://www.typescriptlang.org/) [SQL](https://en.wikipedia.org/wiki/SQL) query builder.
Inspired by [Knex.js](http://knexjs.org/). Mainly developed for [Node.js](https://nodejs.org/en/), but it also
runs in other [JavaScript](https://developer.mozilla.org/en-US/docs/Web/JavaScript) environments such as [Deno](https://deno.com/), [Bun](https://bun.sh/), [Cloudflare Workers](https://workers.cloudflare.com/),
and web browsers.
![](https://github.com/kysely-org/kysely/blob/master/assets/demo.gif)
Kysely makes sure you only refer to tables and columns that are visible to the part of the query
you're writing. The result type only has the selected columns with correct types and aliases. As an
added bonus you get autocompletion for all that stuff.
As shown in the gif above, through the pure magic of modern TypeScript, Kysely is even able to parse
the alias given to `pet.name` and add the `pet_name` column to the result row type. Kysely is able to infer
column names, aliases and types from selected subqueries, joined subqueries, `with` statements and pretty
much anything you can think of.
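A sketch of the technique: TypeScript's template literal types can pull an alias out of a selection string at compile time. This is a simplified, hypothetical illustration (`ParseAlias` and `parseAlias` are made-up names), not Kysely's actual implementation:

```typescript
// Hypothetical, simplified sketch of type-level alias parsing -- not
// Kysely's real implementation.
type ParseAlias<S extends string> =
  S extends `${string} as ${infer Alias}` ? Alias : S

// Runtime counterpart mirroring the type-level logic.
function parseAlias<S extends string>(selection: S): ParseAlias<S> {
  const parts = selection.split(' as ')
  return (parts.length > 1 ? parts[parts.length - 1] : selection) as ParseAlias<S>
}

// `aliased` has the literal type 'pet_name', so downstream typos like
// `result.petName` would fail to compile.
const aliased = parseAlias('pet.name as pet_name')
console.log(aliased) // pet_name
```

The real implementation handles far more selection shapes (subqueries, expressions, `with` clauses), but the compile-time string parsing above is the core idea.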
Of course there are cases where things cannot be typed at compile time, and Kysely offers escape
hatches for these situations. See the [sql template tag](https://kysely-org.github.io/kysely-apidoc/interfaces/Sql.html)
and the [DynamicModule](https://kysely-org.github.io/kysely-apidoc/classes/DynamicModule.html#ref) for more info.
All API documentation is written in the typing files, so you can simply hover over the module, class,
or method you're using to see it in your IDE. The same documentation is also hosted [here](https://kysely-org.github.io/kysely-apidoc/).
If you start using Kysely and can't find something you'd want to use, please open an issue or join our
[Discord server](https://discord.gg/xyBJ3GwvAm).
# Getting started
Please visit our documentation site [kysely.dev](https://kysely.dev) to get started. We also have comprehensive
API documentation hosted [here](https://kysely-org.github.io/kysely-apidoc/), but you can access the same
documentation in your IDE by hovering over a class/method/property/whatever.
# Core team
## Project leads
Responsible for project direction, API design, maintenance, code reviews, community support, documentation, and working on some of the most
impactful/challenging things.
<table>
<tbody>
<tr>
<td align="center">
<a href="https://github.com/koskimas">
<img src="https://avatars.githubusercontent.com/u/846508?v=4?s=100" width="100px;" alt=""/>
<br />
Sami Koskimäki
</a>
<br />
(the <a href="https://web.archive.org/web/20211203210043/https://www.jakso.me/blog/kysely-a-type-safe-sql-query-builder-for-typescript">author</a>)
</td>
<td align="center">
<a href="https://github.com/igalklebanov">
<img src="https://avatars.githubusercontent.com/u/14938291?v=4&s=100" width="100px;" alt=""/>
<br />
Igal Klebanov
</a>
<br />
(the <a href="https://github.com/kysely-org/kysely/pull/1414#issuecomment-2781281996">dynamo</a>)
</td>
</tr>
</tbody>
</table>
## Honorable mentions
People who had special impact on the project and its growth.
<table>
<tbody>
<tr>
<td align="center">
<a href="https://github.com/fhur">
<img src="https://avatars.githubusercontent.com/u/6452323?v=4&s=100" width="100px;" alt=""/>
<br />
Fernando Hurtado
</a>
<br />
(1st <a href="https://kysely.dev">docs</a>)
</td>
<td align="center">
<a href="https://github.com/wirekang">
<img src="https://avatars.githubusercontent.com/u/43294688?v=4&s=100" width="100px;" alt=""/>
<br />
Wirekang
</a>
<br />
(<a href="https://kyse.link">playground</a>)
</td>
<td align="center">
<a href="https://github.com/tgriesser">
<img src="https://avatars.githubusercontent.com/u/154748?v=4&s=100" width="100px;" alt=""/>
<br />
Tim Griesser
</a>
<br />
(<a href="https://knexjs.org/">Knex</a>)
</td>
</tr>
<tr>
<td align="center">
<a href="https://github.com/RobinBlomberg">
<img src="https://avatars.githubusercontent.com/u/20827397?v=4&s=100" width="100px;" alt=""/>
<br />
Robin Blomberg
</a>
<br />
(<a href="https://github.com/RobinBlomberg/kysely-codegen">codegen</a>)
</td>
<td align="center">
<a href="https://github.com/nexxeln">
<img src="https://avatars.githubusercontent.com/u/95541290?v=4&s=100" width="100px" alt="" />
<br />
Shoubhit Dash
</a>
<br />
(prisma <a href="https://www.nexxel.dev/blog/typesafe-database">idea</a>)
</td>
<td align="center">
<a href="https://github.com/valtyr">
<img src="https://avatars.githubusercontent.com/u/3050355?v=4&s=100" width="100px" alt="" />
<br />
Valtýr Örn Kjartansson
</a>
<br />
(prisma <a href="https://github.com/valtyr/prisma-kysely">impl</a>)
</td>
</tr>
<tr>
<td align="center">
<a href="https://github.com/thdxr">
<img src="https://avatars.githubusercontent.com/u/826656?v=4&s=100" width="100px;" alt=""/>
<br />
Dax Raad
</a>
<br />
(early <a href="https://thdxr.com/post/serverless-relational-showdown">adopter</a>)
</td>
<td align="center">
<a href="https://github.com/t3dotgg">
<img src="https://avatars.githubusercontent.com/u/6751787?v=4&s=100" width="100px;" alt=""/>
<br />
Theo Browne
</a>
<br />
(early <a href="https://discord.com/channels/966627436387266600/988912020558602331/993220628154961930">promoter</a>)
</td>
<td align="center">
<a href="https://github.com/leerob">
<img src="https://avatars.githubusercontent.com/u/9113740?v=4&s=100" width="100px;" alt="" />
<br />
Lee Robinson
</a>
<br />
(early <a href="https://x.com/leerob/status/1576929372811849730">promoter</a>)
</td>
</tr>
<tr>
<td align="center">
<a href="https://github.com/ethanresnick">
<img src="https://avatars.githubusercontent.com/u/471894?v=4&s=100" width="100px" alt="" />
<br />
Ethan Resnick
</a>
<br />
(timely <a href="https://github.com/kysely-org/kysely/issues/494">feedback</a>)
</td>
<td align="center">
<a href="https://github.com/thetutlage">
<img src="https://avatars.githubusercontent.com/u/1706381?v=4&s=100" width="100px;" alt="" />
<br />
Harminder Virk
</a>
<br />
(dope <a href="https://github.com/thetutlage/meta/discussions/8">writeup</a>)
</td>
<td align="center">
<a href="https://github.com/elitan">
<img src="https://avatars.githubusercontent.com/u/331818?v=4&s=100" width="100px;" alt="" />
<br />
Johan Eliasson
</a>
<br />
(<a href="https://eliasson.me/articles/crafting-the-perfect-t3-stack-my-journey-with-kysely-atlas-and-clerk">promoter</a>/<a href="https://www.youtube.com/watch?v=u2s39dRIpCM">educator</a>)
</td>
</tr>
<!-- <tr>
<td align="center">
<a href="">
<img src="" width="100px;" alt="" />
<br />
Name
</a>
<br />
(contribution)
</td>
</tr> -->
</tbody>
</table>
## All contributors
<p align="center">
<a href="https://github.com/kysely-org/kysely/graphs/contributors">
<img src="https://contrib.rocks/image?repo=kysely-org/kysely" />
</a>
<br />
<span>Want to contribute? Check out our <a href="./CONTRIBUTING.md" >contribution guidelines</a>.</span>
</p>
<p align="center">
<a href="https://vercel.com/?utm_source=kysely&utm_campaign=oss">
<img src="https://kysely.dev/img/powered-by-vercel.svg" alt="Powered by Vercel" />
</a>
</p>


@@ -0,0 +1,62 @@
/**
* An interface for getting the database metadata (names of the tables and columns etc.)
*/
export interface DatabaseIntrospector {
/**
* Get schema metadata.
*/
getSchemas(): Promise<SchemaMetadata[]>;
/**
* Get tables and views metadata.
*/
getTables(options?: DatabaseMetadataOptions): Promise<TableMetadata[]>;
/**
* Get the database metadata such as table and column names.
*
* @deprecated Use getTables() instead.
*/
getMetadata(options?: DatabaseMetadataOptions): Promise<DatabaseMetadata>;
}
export interface DatabaseMetadataOptions {
/**
* If this is true, the metadata contains the internal kysely tables
* such as the migration tables.
*/
withInternalKyselyTables: boolean;
}
export interface SchemaMetadata {
readonly name: string;
}
export interface DatabaseMetadata {
/**
* The tables and views found in the database.
* The property `isView` can be used to tell them apart.
*/
readonly tables: TableMetadata[];
}
export interface TableMetadata {
readonly name: string;
readonly isView: boolean;
readonly columns: ColumnMetadata[];
readonly schema?: string;
}
export interface ColumnMetadata {
readonly name: string;
/**
* The data type of the column as reported by the database.
*
* NOTE: This value is whatever the database engine returns and it will be
* different on different dialects even if you run the same migrations.
* For example `integer` datatype in a migration will produce `int4`
* on PostgreSQL, `INTEGER` on SQLite and `int` on MySQL.
*/
readonly dataType: string;
/**
* The schema this column's data type was created in.
*/
readonly dataTypeSchema?: string;
readonly isAutoIncrementing: boolean;
readonly isNullable: boolean;
readonly hasDefaultValue: boolean;
readonly comment?: string;
}
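As a sketch of how this metadata might be consumed (say, by a codegen or debug tool), with local stand-in interfaces so the snippet is self-contained; in real code these would come from `kysely`:

```typescript
// Local stand-ins mirroring a subset of the interfaces above, so the
// sketch runs stand-alone; import the real ones from 'kysely'.
interface ColumnMetadata {
  readonly name: string
  readonly dataType: string
  readonly isNullable: boolean
}

interface TableMetadata {
  readonly name: string
  readonly isView: boolean
  readonly columns: ColumnMetadata[]
}

// Render a compact, human-readable schema summary for one table or view.
function describeTable(table: TableMetadata): string {
  const kind = table.isView ? 'view' : 'table'
  const cols = table.columns
    .map((c) => `${c.name}: ${c.dataType}${c.isNullable ? ' | null' : ''}`)
    .join(', ')
  return `${kind} ${table.name} (${cols})`
}

const person: TableMetadata = {
  name: 'person',
  isView: false,
  columns: [
    { name: 'id', dataType: 'int4', isNullable: false },
    { name: 'nickname', dataType: 'text', isNullable: true },
  ],
}

console.log(describeTable(person))
// table person (id: int4, nickname: text | null)
```

Note the `int4` data type here assumes PostgreSQL; per the docs above, the same migration would report `INTEGER` on SQLite and `int` on MySQL.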


@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });


@@ -0,0 +1,94 @@
import type { Kysely } from '../kysely.js';
import type { DialectAdapter, MigrationLockOptions } from './dialect-adapter.js';
/**
* A basic implementation of `DialectAdapter` with sensible default values.
* Third-party dialects can extend this instead of implementing the `DialectAdapter`
* interface from scratch. That way all new settings will get default values when
* they are added and there will be less breaking changes.
*/
export declare abstract class DialectAdapterBase implements DialectAdapter {
/**
* Whether or not this dialect supports `if not exists` in creation of tables/schemas/views/etc.
*
* If this is false, Kysely's internal migrations tables and schemas are created
* without `if not exists` in migrations. This is not a problem if the dialect
* supports transactional DDL.
*/
get supportsCreateIfNotExists(): boolean;
/**
* Whether or not this dialect supports transactional DDL.
*
* If this is true, migrations are executed inside a transaction.
*/
get supportsTransactionalDdl(): boolean;
/**
* Whether or not this dialect supports the `returning` clause in inserts,
* updates and deletes.
*/
get supportsReturning(): boolean;
/**
 * Whether or not this dialect supports the `output` clause in inserts,
 * updates and deletes.
 */
get supportsOutput(): boolean;
/**
* This method is used to acquire a lock for the migrations so that
* it's not possible for two migration operations to run in parallel.
*
* Most dialects have explicit locks that can be used, like advisory locks
* in PostgreSQL and the get_lock function in MySQL.
*
* If the dialect doesn't have explicit locks the {@link MigrationLockOptions.lockTable}
* created by Kysely can be used instead. You can access it through the `options` object.
* The lock table has two columns `id` and `is_locked` and there's only one row in the table
* whose id is {@link MigrationLockOptions.lockRowId}. `is_locked` is an integer. Kysely
* takes care of creating the lock table and inserting the one single row to it before this
* method is executed. If the dialect supports schemas and the user has specified a custom
* schema in their migration settings, the options object also contains the schema name in
* {@link MigrationLockOptions.lockTableSchema}.
*
* Here's an example of how you might implement this method for a dialect that doesn't
* have explicit locks but supports `FOR UPDATE` row locks and transactional DDL:
*
* ```ts
* import { DialectAdapterBase, type MigrationLockOptions, Kysely } from 'kysely'
*
* export class MyAdapter extends DialectAdapterBase {
* override async acquireMigrationLock(
* db: Kysely<any>,
* options: MigrationLockOptions
* ): Promise<void> {
* const queryDb = options.lockTableSchema
* ? db.withSchema(options.lockTableSchema)
* : db
*
* // Since our imaginary dialect supports transactional DDL and has
* // row locks, we can simply take a row lock here and it will guarantee
* // all subsequent calls to this method from other transactions will
* // wait until this transaction finishes.
* await queryDb
* .selectFrom(options.lockTable)
* .selectAll()
* .where('id', '=', options.lockRowId)
* .forUpdate()
* .execute()
* }
*
* override async releaseMigrationLock() {
* // noop
* }
* }
* ```
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations will be executed. Otherwise
* `db` is a single connection (session) that will be used to execute the
* migrations.
*/
abstract acquireMigrationLock(db: Kysely<any>, options: MigrationLockOptions): Promise<void>;
/**
* Releases the migration lock. See {@link acquireMigrationLock}.
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations were executed. Otherwise `db`
* is a single connection (session) that was used to execute the migrations
* and the `acquireMigrationLock` call.
*/
abstract releaseMigrationLock(db: Kysely<any>, options: MigrationLockOptions): Promise<void>;
}


@@ -0,0 +1,24 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.DialectAdapterBase = void 0;
/**
* A basic implementation of `DialectAdapter` with sensible default values.
* Third-party dialects can extend this instead of implementing the `DialectAdapter`
* interface from scratch. That way all new settings will get default values when
* they are added and there will be less breaking changes.
*/
class DialectAdapterBase {
get supportsCreateIfNotExists() {
return true;
}
get supportsTransactionalDdl() {
return false;
}
get supportsReturning() {
return false;
}
get supportsOutput() {
return false;
}
}
exports.DialectAdapterBase = DialectAdapterBase;
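The point of the base class is that third-party adapters override only what differs from these defaults. A self-contained sketch using a local copy of the defaults above (a real adapter would extend Kysely's `DialectAdapterBase`; `MyAdapter` is a hypothetical name):

```typescript
// Local copy of the default settings shown above, so the sketch runs
// stand-alone; real code extends DialectAdapterBase from 'kysely'.
class AdapterBase {
  get supportsCreateIfNotExists(): boolean { return true }
  get supportsTransactionalDdl(): boolean { return false }
  get supportsReturning(): boolean { return false }
  get supportsOutput(): boolean { return false }
}

// A third-party adapter overrides only the settings that differ; any
// setting added to the base class later picks up its default automatically,
// so fewer breaking changes reach dialect authors.
class MyAdapter extends AdapterBase {
  override get supportsTransactionalDdl(): boolean { return true }
}

const adapter = new MyAdapter()
console.log(adapter.supportsTransactionalDdl, adapter.supportsReturning)
// true false
```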


@@ -0,0 +1,115 @@
import type { Kysely } from '../kysely.js';
/**
* A `DialectAdapter` encapsulates all differences between dialects outside
* of `Driver` and `QueryCompiler`.
*
* For example, some databases support transactional DDL and therefore we want
* to run migrations inside a transaction, while other databases don't support
* it. For that there's a `supportsTransactionalDdl` boolean in this interface.
*/
export interface DialectAdapter {
/**
* Whether or not this dialect supports `if not exists` in creation of tables/schemas/views/etc.
*
* If this is false, Kysely's internal migrations tables and schemas are created
* without `if not exists` in migrations. This is not a problem if the dialect
* supports transactional DDL.
*/
readonly supportsCreateIfNotExists: boolean;
/**
* Whether or not this dialect supports transactional DDL.
*
* If this is true, migrations are executed inside a transaction.
*/
readonly supportsTransactionalDdl: boolean;
/**
* Whether or not this dialect supports the `returning` clause in inserts,
* updates and deletes.
*/
readonly supportsReturning: boolean;
/**
* Whether or not this dialect supports the `output` clause in inserts,
* updates and deletes.
*/
readonly supportsOutput?: boolean;
/**
* This method is used to acquire a lock for the migrations so that
* it's not possible for two migration operations to run in parallel.
*
* Most dialects have explicit locks that can be used, like advisory locks
* in PostgreSQL and the get_lock function in MySQL.
*
* If the dialect doesn't have explicit locks the {@link MigrationLockOptions.lockTable}
* created by Kysely can be used instead. You can access it through the `options` object.
* The lock table has two columns `id` and `is_locked` and there's only one row in the table
* whose id is {@link MigrationLockOptions.lockRowId}. `is_locked` is an integer. Kysely
* takes care of creating the lock table and inserting the one single row to it before this
* method is executed. If the dialect supports schemas and the user has specified a custom
* schema in their migration settings, the options object also contains the schema name in
* {@link MigrationLockOptions.lockTableSchema}.
*
* Here's an example of how you might implement this method for a dialect that doesn't
* have explicit locks but supports `FOR UPDATE` row locks and transactional DDL:
*
* ```ts
* import { DialectAdapterBase, type MigrationLockOptions, Kysely } from 'kysely'
*
* export class MyAdapter extends DialectAdapterBase {
* override async acquireMigrationLock(
* db: Kysely<any>,
* options: MigrationLockOptions
* ): Promise<void> {
* const queryDb = options.lockTableSchema
* ? db.withSchema(options.lockTableSchema)
* : db
*
* // Since our imaginary dialect supports transactional DDL and has
* // row locks, we can simply take a row lock here and it will guarantee
* // all subsequent calls to this method from other transactions will
* // wait until this transaction finishes.
* await queryDb
* .selectFrom(options.lockTable)
* .selectAll()
* .where('id', '=', options.lockRowId)
* .forUpdate()
* .execute()
* }
*
* override async releaseMigrationLock() {
* // noop
* }
* }
* ```
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations will be executed. Otherwise
* `db` is a single connection (session) that will be used to execute the
* migrations.
*/
acquireMigrationLock(db: Kysely<any>, options: MigrationLockOptions): Promise<void>;
/**
* Releases the migration lock. See {@link acquireMigrationLock}.
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations were executed. Otherwise `db`
* is a single connection (session) that was used to execute the migrations
* and the `acquireMigrationLock` call.
*/
releaseMigrationLock(db: Kysely<any>, options: MigrationLockOptions): Promise<void>;
}
export interface MigrationLockOptions {
/**
* The name of the migration lock table.
*/
readonly lockTable: string;
/**
* The id of the only row in the migration lock table.
*/
readonly lockRowId: string;
/**
* The schema in which the migration lock table lives. This is only
* defined if the user has specified a custom schema in the migration
* settings.
*/
readonly lockTableSchema?: string;
}


@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });

34
node_modules/kysely/dist/cjs/dialect/dialect.d.ts generated vendored Normal file

@@ -0,0 +1,34 @@
import type { Driver } from '../driver/driver.js';
import type { Kysely } from '../kysely.js';
import type { QueryCompiler } from '../query-compiler/query-compiler.js';
import type { DatabaseIntrospector } from './database-introspector.js';
import type { DialectAdapter } from './dialect-adapter.js';
/**
* A Dialect is the glue between Kysely and the underlying database engine.
*
* See the built-in {@link PostgresDialect} as an example of a dialect.
* Users can implement their own dialects and use them by passing it
* in the {@link KyselyConfig.dialect} property.
*/
export interface Dialect {
/**
* Creates a driver for the dialect.
*/
createDriver(): Driver;
/**
* Creates a query compiler for the dialect.
*/
createQueryCompiler(): QueryCompiler;
/**
* Creates an adapter for the dialect.
*/
createAdapter(): DialectAdapter;
/**
* Creates a database introspector that can be used to get database metadata
* such as the table names and column names of those tables.
*
* `db` never has any plugins installed. It's created using
* {@link Kysely.withoutPlugins}.
*/
createIntrospector(db: Kysely<any>): DatabaseIntrospector;
}

2
node_modules/kysely/dist/cjs/dialect/dialect.js generated vendored Normal file

@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });


@@ -0,0 +1,83 @@
import type { Kysely } from '../../kysely.js';
import { DialectAdapterBase } from '../dialect-adapter-base.js';
export declare class MssqlAdapter extends DialectAdapterBase {
/**
* Whether or not this dialect supports `if not exists` in creation of tables/schemas/views/etc.
*
* If this is false, Kysely's internal migrations tables and schemas are created
* without `if not exists` in migrations. This is not a problem if the dialect
* supports transactional DDL.
*/
get supportsCreateIfNotExists(): boolean;
/**
* Whether or not this dialect supports transactional DDL.
*
* If this is true, migrations are executed inside a transaction.
*/
get supportsTransactionalDdl(): boolean;
/**
 * Whether or not this dialect supports the `output` clause in inserts,
 * updates and deletes.
 */
get supportsOutput(): boolean;
/**
* This method is used to acquire a lock for the migrations so that
* it's not possible for two migration operations to run in parallel.
*
* Most dialects have explicit locks that can be used, like advisory locks
* in PostgreSQL and the get_lock function in MySQL.
*
* If the dialect doesn't have explicit locks the {@link MigrationLockOptions.lockTable}
* created by Kysely can be used instead. You can access it through the `options` object.
* The lock table has two columns `id` and `is_locked` and there's only one row in the table
* whose id is {@link MigrationLockOptions.lockRowId}. `is_locked` is an integer. Kysely
* takes care of creating the lock table and inserting the one single row to it before this
* method is executed. If the dialect supports schemas and the user has specified a custom
* schema in their migration settings, the options object also contains the schema name in
* {@link MigrationLockOptions.lockTableSchema}.
*
* Here's an example of how you might implement this method for a dialect that doesn't
* have explicit locks but supports `FOR UPDATE` row locks and transactional DDL:
*
* ```ts
* import { DialectAdapterBase, type MigrationLockOptions, Kysely } from 'kysely'
*
* export class MyAdapter extends DialectAdapterBase {
* override async acquireMigrationLock(
* db: Kysely<any>,
* options: MigrationLockOptions
* ): Promise<void> {
* const queryDb = options.lockTableSchema
* ? db.withSchema(options.lockTableSchema)
* : db
*
* // Since our imaginary dialect supports transactional DDL and has
* // row locks, we can simply take a row lock here and it will guarantee
* // all subsequent calls to this method from other transactions will
* // wait until this transaction finishes.
* await queryDb
* .selectFrom(options.lockTable)
* .selectAll()
* .where('id', '=', options.lockRowId)
* .forUpdate()
* .execute()
* }
*
* override async releaseMigrationLock() {
* // noop
* }
* }
* ```
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations will be executed. Otherwise
* `db` is a single connection (session) that will be used to execute the
* migrations.
*/
acquireMigrationLock(db: Kysely<any>): Promise<void>;
/**
* Releases the migration lock. See {@link acquireMigrationLock}.
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations were executed. Otherwise `db`
* is a single connection (session) that was used to execute the migrations
* and the `acquireMigrationLock` call.
*/
releaseMigrationLock(): Promise<void>;
}


@@ -0,0 +1,28 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MssqlAdapter = void 0;
const migrator_js_1 = require("../../migration/migrator.js");
const sql_js_1 = require("../../raw-builder/sql.js");
const dialect_adapter_base_js_1 = require("../dialect-adapter-base.js");
class MssqlAdapter extends dialect_adapter_base_js_1.DialectAdapterBase {
get supportsCreateIfNotExists() {
return false;
}
get supportsTransactionalDdl() {
return true;
}
get supportsOutput() {
return true;
}
async acquireMigrationLock(db) {
// Acquire a transaction-level exclusive lock on the migrations table.
// https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-getapplock-transact-sql?view=sql-server-ver16
await (0, sql_js_1.sql) `exec sp_getapplock @DbPrincipal = ${sql_js_1.sql.lit('dbo')}, @Resource = ${sql_js_1.sql.lit(migrator_js_1.DEFAULT_MIGRATION_TABLE)}, @LockMode = ${sql_js_1.sql.lit('Exclusive')}`.execute(db);
}
async releaseMigrationLock() {
// Nothing to do here. `sp_getapplock` is automatically released at the
// end of the transaction, and since `supportsTransactionalDdl` is true, we know
// the `db` instance passed to acquireMigrationLock is actually a transaction.
}
}
exports.MssqlAdapter = MssqlAdapter;


@@ -0,0 +1,181 @@
import type { KyselyTypeError } from '../../util/type-error.js';
export interface MssqlDialectConfig {
/**
* When `true`, connections are reset to their initial states when released
* back to the pool, resulting in additional requests to the database.
*
* Defaults to `false`.
*/
resetConnectionsOnRelease?: boolean;
/**
* This dialect uses the `tarn` package to manage the connection pool to your
* database. To use it as a peer dependency and not bundle it with Kysely's code,
* you need to pass the `tarn` package itself. You also need to pass some pool options
* (excluding `create`, `destroy` and `validate` functions which are controlled by this dialect),
* `min` & `max` connections at the very least.
*
* ### Examples
*
* ```ts
* import { MssqlDialect } from 'kysely'
* import * as Tarn from 'tarn'
* import * as Tedious from 'tedious'
*
* const dialect = new MssqlDialect({
* tarn: { ...Tarn, options: { max: 10, min: 0 } },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* // ...
* server: 'localhost',
* // ...
* }),
* }
* })
* ```
*/
tarn: Tarn;
/**
* This dialect uses the `tedious` package to communicate with your MS SQL Server
* database. To use it as a peer dependency and not bundle it with Kysely's code,
* you need to pass the `tedious` package itself. You also need to pass a factory
* function that creates new `tedious` `Connection` instances on demand.
*
* ### Examples
*
* ```ts
* import { MssqlDialect } from 'kysely'
* import * as Tarn from 'tarn'
* import * as Tedious from 'tedious'
*
* const dialect = new MssqlDialect({
* tarn: { ...Tarn, options: { max: 10, min: 0 } },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* // ...
* server: 'localhost',
* // ...
* }),
* }
* })
* ```
*/
tedious: Tedious;
/**
* When `true`, connections are validated before being acquired from the pool,
* resulting in additional requests to the database.
*
* Defaults to `true`.
*/
validateConnections?: boolean;
}
export interface Tedious {
connectionFactory: () => TediousConnection | Promise<TediousConnection>;
ISOLATION_LEVEL: TediousIsolationLevel;
Request: TediousRequestClass;
/**
* @deprecated use {@link MssqlDialectConfig.resetConnectionsOnRelease} instead.
*/
resetConnectionOnRelease?: KyselyTypeError<'deprecated: use `MssqlDialectConfig.resetConnectionsOnRelease` instead'>;
TYPES: TediousTypes;
}
export interface TediousConnection {
beginTransaction(callback: (err: Error | null | undefined, transactionDescriptor?: any) => void, name?: string | undefined, isolationLevel?: number | undefined): void;
cancel(): boolean;
close(): void;
commitTransaction(callback: (err: Error | null | undefined) => void, name?: string | undefined): void;
connect(connectListener: (err?: Error) => void): void;
execSql(request: TediousRequest): void;
off(event: 'error', listener: (error: unknown) => void): this;
off(event: string, listener: (...args: any[]) => void): this;
on(event: 'error', listener: (error: unknown) => void): this;
on(event: string, listener: (...args: any[]) => void): this;
once(event: 'end', listener: () => void): this;
once(event: string, listener: (...args: any[]) => void): this;
reset(callback: (err: Error | null | undefined) => void): void;
rollbackTransaction(callback: (err: Error | null | undefined) => void, name?: string | undefined): void;
saveTransaction(callback: (err: Error | null | undefined) => void, name: string): void;
}
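The connection API above is callback-based; a driver implementation typically wraps these calls in promises. A minimal sketch with a hypothetical in-memory connection (`fakeConnection` and `withTransaction` are illustration-only names, not part of Kysely or tedious):

```typescript
// Only the subset of the callback signature the sketch needs.
interface CallbackTx {
  beginTransaction(cb: (err: Error | null | undefined) => void): void
  commitTransaction(cb: (err: Error | null | undefined) => void): void
}

// Adapt a node-style callback method to a promise.
function promisify(
  run: (cb: (err: Error | null | undefined) => void) => void,
): Promise<void> {
  return new Promise((resolve, reject) => {
    run((err) => (err ? reject(err) : resolve()))
  })
}

// Hypothetical in-memory connection standing in for a Tedious.Connection,
// recording the order in which its methods are called.
const log: string[] = []
const fakeConnection: CallbackTx = {
  beginTransaction: (cb) => { log.push('begin'); cb(null) },
  commitTransaction: (cb) => { log.push('commit'); cb(null) },
}

async function withTransaction(conn: CallbackTx, work: () => Promise<void>): Promise<void> {
  await promisify((cb) => conn.beginTransaction(cb))
  await work()
  await promisify((cb) => conn.commitTransaction(cb))
}

const done = withTransaction(fakeConnection, async () => { log.push('work') })
done.then(() => console.log(log.join(' -> '))) // begin -> work -> commit
```

A real driver would also promisify `rollbackTransaction` for the error path and `reset` for pool release; the adaptation pattern is the same.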
export type TediousIsolationLevel = Record<string, number>;
export interface TediousRequestClass {
new (sqlTextOrProcedure: string | undefined, callback: (error?: Error | null, rowCount?: number, rows?: any) => void, options?: {
statementColumnEncryptionSetting?: any;
}): TediousRequest;
}
export declare class TediousRequest {
addParameter(name: string, dataType: TediousDataType, value?: unknown, options?: Readonly<{
output?: boolean;
length?: number;
precision?: number;
scale?: number;
}> | null): void;
off(event: 'row', listener: (columns: any) => void): this;
off(event: string, listener: (...args: any[]) => void): this;
on(event: 'row', listener: (columns: any) => void): this;
on(event: string, listener: (...args: any[]) => void): this;
once(event: 'requestCompleted', listener: () => void): this;
once(event: string, listener: (...args: any[]) => void): this;
pause(): void;
resume(): void;
}
export interface TediousTypes {
NVarChar: TediousDataType;
BigInt: TediousDataType;
Int: TediousDataType;
Float: TediousDataType;
Bit: TediousDataType;
DateTime: TediousDataType;
VarBinary: TediousDataType;
[x: string]: TediousDataType;
}
export interface TediousDataType {
}
export interface TediousColumnValue {
metadata: {
colName: string;
};
value: any;
}
export interface Tarn {
/**
* Tarn.js' pool options, excluding `create`, `destroy` and `validate` functions,
* which must be implemented by this dialect.
*/
options: Omit<TarnPoolOptions<any>, 'create' | 'destroy' | 'validate'> & {
/**
* @deprecated use {@link MssqlDialectConfig.validateConnections} instead.
*/
validateConnections?: KyselyTypeError<'deprecated: use `MssqlDialectConfig.validateConnections` instead'>;
};
/**
* Tarn.js' Pool class.
*/
Pool: typeof TarnPool;
}
export declare class TarnPool<R> {
constructor(opt: TarnPoolOptions<R>);
acquire(): TarnPendingRequest<R>;
destroy(): any;
release(resource: R): void;
}
export interface TarnPoolOptions<R> {
acquireTimeoutMillis?: number;
create(cb: (err: Error | null, resource: R) => void): any | (() => Promise<R>);
createRetryIntervalMillis?: number;
createTimeoutMillis?: number;
destroy(resource: R): any;
destroyTimeoutMillis?: number;
idleTimeoutMillis?: number;
log?(msg: string): any;
max: number;
min: number;
propagateCreateError?: boolean;
reapIntervalMillis?: number;
validate?(resource: R): boolean;
}
export interface TarnPendingRequest<R> {
promise: Promise<R>;
resolve: (resource: R) => void;
reject: (err: Error) => void;
}
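The `TarnPendingRequest` shape above is a classic deferred: a promise whose `resolve` and `reject` handles are exposed so the pool can settle it later, once a resource becomes available. A minimal sketch of that pattern (the factory name is ours, not part of tarn's API):

```typescript
// Create a promise and capture its resolve/reject handles so another
// party (here: the pool) can settle it asynchronously.
function createPendingRequest<R>(): {
  promise: Promise<R>
  resolve: (resource: R) => void
  reject: (err: Error) => void
} {
  let resolve!: (resource: R) => void
  let reject!: (err: Error) => void
  const promise = new Promise<R>((res, rej) => {
    resolve = res
    reject = rej
  })
  return { promise, resolve, reject }
}
```

`acquire()` returns such an object immediately; the caller awaits `promise` while the pool resolves it from `create` or from an idle resource.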


@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });


@@ -0,0 +1,70 @@
import type { Driver } from '../../driver/driver.js';
import type { Kysely } from '../../kysely.js';
import type { QueryCompiler } from '../../query-compiler/query-compiler.js';
import type { DatabaseIntrospector } from '../database-introspector.js';
import type { DialectAdapter } from '../dialect-adapter.js';
import type { Dialect } from '../dialect.js';
import type { MssqlDialectConfig } from './mssql-dialect-config.js';
/**
* MS SQL Server dialect that uses the [tedious](https://tediousjs.github.io/tedious)
* library.
*
* The constructor takes an instance of {@link MssqlDialectConfig}.
*
* ```ts
* import * as Tedious from 'tedious'
* import * as Tarn from 'tarn'
*
* const dialect = new MssqlDialect({
* tarn: {
* ...Tarn,
* options: {
* min: 0,
* max: 10,
* },
* },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* authentication: {
* options: {
* password: 'password',
* userName: 'username',
* },
* type: 'default',
* },
* options: {
* database: 'some_db',
* port: 1433,
* trustServerCertificate: true,
* },
* server: 'localhost',
* }),
* },
* })
* ```
*/
export declare class MssqlDialect implements Dialect {
#private;
constructor(config: MssqlDialectConfig);
/**
* Creates a driver for the dialect.
*/
createDriver(): Driver;
/**
* Creates a query compiler for the dialect.
*/
createQueryCompiler(): QueryCompiler;
/**
* Creates an adapter for the dialect.
*/
createAdapter(): DialectAdapter;
/**
* Creates a database introspector that can be used to get database metadata
* such as the table names and column names of those tables.
*
* `db` never has any plugins installed. It's created using
* {@link Kysely.withoutPlugins}.
*/
createIntrospector(db: Kysely<any>): DatabaseIntrospector;
}


@@ -0,0 +1,65 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MssqlDialect = void 0;
const mssql_adapter_js_1 = require("./mssql-adapter.js");
const mssql_driver_js_1 = require("./mssql-driver.js");
const mssql_introspector_js_1 = require("./mssql-introspector.js");
const mssql_query_compiler_js_1 = require("./mssql-query-compiler.js");
/**
* MS SQL Server dialect that uses the [tedious](https://tediousjs.github.io/tedious)
* library.
*
* The constructor takes an instance of {@link MssqlDialectConfig}.
*
* ```ts
* import * as Tedious from 'tedious'
* import * as Tarn from 'tarn'
*
* const dialect = new MssqlDialect({
* tarn: {
* ...Tarn,
* options: {
* min: 0,
* max: 10,
* },
* },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* authentication: {
* options: {
* password: 'password',
* userName: 'username',
* },
* type: 'default',
* },
* options: {
* database: 'some_db',
* port: 1433,
* trustServerCertificate: true,
* },
* server: 'localhost',
* }),
* },
* })
* ```
*/
class MssqlDialect {
#config;
constructor(config) {
this.#config = config;
}
createDriver() {
return new mssql_driver_js_1.MssqlDriver(this.#config);
}
createQueryCompiler() {
return new mssql_query_compiler_js_1.MssqlQueryCompiler();
}
createAdapter() {
return new mssql_adapter_js_1.MssqlAdapter();
}
createIntrospector(db) {
return new mssql_introspector_js_1.MssqlIntrospector(db);
}
}
exports.MssqlDialect = MssqlDialect;


@@ -0,0 +1,59 @@
import type { DatabaseConnection, QueryResult } from '../../driver/database-connection.js';
import type { Driver, TransactionSettings } from '../../driver/driver.js';
import type { MssqlDialectConfig, Tedious, TediousConnection } from './mssql-dialect-config.js';
import { CompiledQuery } from '../../query-compiler/compiled-query.js';
declare const PRIVATE_RESET_METHOD: unique symbol;
declare const PRIVATE_DESTROY_METHOD: unique symbol;
declare const PRIVATE_VALIDATE_METHOD: unique symbol;
export declare class MssqlDriver implements Driver {
#private;
constructor(config: MssqlDialectConfig);
/**
* Initializes the driver.
*
* After calling this method the driver should be usable and `acquireConnection` etc.
* methods should be callable.
*/
init(): Promise<void>;
/**
* Acquires a new connection from the pool.
*/
acquireConnection(): Promise<DatabaseConnection>;
/**
* Begins a transaction.
*/
beginTransaction(connection: MssqlConnection, settings: TransactionSettings): Promise<void>;
/**
* Commits a transaction.
*/
commitTransaction(connection: MssqlConnection): Promise<void>;
/**
* Rolls back a transaction.
*/
rollbackTransaction(connection: MssqlConnection): Promise<void>;
savepoint(connection: MssqlConnection, savepointName: string): Promise<void>;
rollbackToSavepoint(connection: MssqlConnection, savepointName: string): Promise<void>;
/**
* Releases a connection back to the pool.
*/
releaseConnection(connection: MssqlConnection): Promise<void>;
/**
* Destroys the driver and releases all resources.
*/
destroy(): Promise<void>;
}
declare class MssqlConnection implements DatabaseConnection {
#private;
constructor(connection: TediousConnection, tedious: Tedious);
beginTransaction(settings: TransactionSettings): Promise<void>;
commitTransaction(): Promise<void>;
connect(): Promise<this>;
executeQuery<O>(compiledQuery: CompiledQuery): Promise<QueryResult<O>>;
rollbackTransaction(savepointName?: string): Promise<void>;
savepoint(savepointName: string): Promise<void>;
streamQuery<O>(compiledQuery: CompiledQuery, chunkSize: number): AsyncIterableIterator<QueryResult<O>>;
[PRIVATE_DESTROY_METHOD](): Promise<void>;
[PRIVATE_RESET_METHOD](): Promise<void>;
[PRIVATE_VALIDATE_METHOD](): Promise<boolean>;
}
export {};


@@ -0,0 +1,359 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MssqlDriver = void 0;
const object_utils_js_1 = require("../../util/object-utils.js");
const compiled_query_js_1 = require("../../query-compiler/compiled-query.js");
const stack_trace_utils_js_1 = require("../../util/stack-trace-utils.js");
const random_string_js_1 = require("../../util/random-string.js");
const deferred_js_1 = require("../../util/deferred.js");
const PRIVATE_RESET_METHOD = Symbol();
const PRIVATE_DESTROY_METHOD = Symbol();
const PRIVATE_VALIDATE_METHOD = Symbol();
class MssqlDriver {
#config;
#pool;
constructor(config) {
this.#config = (0, object_utils_js_1.freeze)({ ...config });
const { tarn, tedious, validateConnections } = this.#config;
const { validateConnections: deprecatedValidateConnections, ...poolOptions } = tarn.options;
this.#pool = new tarn.Pool({
...poolOptions,
create: async () => {
const connection = await tedious.connectionFactory();
return await new MssqlConnection(connection, tedious).connect();
},
destroy: async (connection) => {
await connection[PRIVATE_DESTROY_METHOD]();
},
// @ts-ignore `tarn` accepts a function that returns a promise here, but
// the types are not aligned and it type errors.
validate: validateConnections === false ||
deprecatedValidateConnections === false
? undefined
: (connection) => connection[PRIVATE_VALIDATE_METHOD](),
});
}
async init() {
// noop
}
async acquireConnection() {
return await this.#pool.acquire().promise;
}
async beginTransaction(connection, settings) {
await connection.beginTransaction(settings);
}
async commitTransaction(connection) {
await connection.commitTransaction();
}
async rollbackTransaction(connection) {
await connection.rollbackTransaction();
}
async savepoint(connection, savepointName) {
await connection.savepoint(savepointName);
}
async rollbackToSavepoint(connection, savepointName) {
await connection.rollbackTransaction(savepointName);
}
async releaseConnection(connection) {
if (this.#config.resetConnectionsOnRelease ||
this.#config.tedious.resetConnectionOnRelease) {
await connection[PRIVATE_RESET_METHOD]();
}
this.#pool.release(connection);
}
async destroy() {
await this.#pool.destroy();
}
}
exports.MssqlDriver = MssqlDriver;
class MssqlConnection {
#connection;
#hasSocketError;
#tedious;
constructor(connection, tedious) {
this.#connection = connection;
this.#hasSocketError = false;
this.#tedious = tedious;
}
async beginTransaction(settings) {
const { isolationLevel } = settings;
await new Promise((resolve, reject) => this.#connection.beginTransaction((error) => {
if (error)
reject(error);
else
resolve(undefined);
}, isolationLevel ? (0, random_string_js_1.randomString)(8) : undefined, isolationLevel
? this.#getTediousIsolationLevel(isolationLevel)
: undefined));
}
async commitTransaction() {
await new Promise((resolve, reject) => this.#connection.commitTransaction((error) => {
if (error)
reject(error);
else
resolve(undefined);
}));
}
async connect() {
const { promise: waitForConnected, reject, resolve } = new deferred_js_1.Deferred();
this.#connection.connect((error) => {
if (error) {
return reject(error);
}
resolve();
});
this.#connection.on('error', (error) => {
if (error instanceof Error &&
'code' in error &&
error.code === 'ESOCKET') {
this.#hasSocketError = true;
}
console.error(error);
reject(error);
});
function endListener() {
reject(new Error('The connection ended without ever completing the connection'));
}
this.#connection.once('end', endListener);
await waitForConnected;
this.#connection.off('end', endListener);
return this;
}
async executeQuery(compiledQuery) {
try {
const deferred = new deferred_js_1.Deferred();
const request = new MssqlRequest({
compiledQuery,
tedious: this.#tedious,
onDone: deferred,
});
this.#connection.execSql(request.request);
const { rowCount, rows } = await deferred.promise;
return {
numAffectedRows: rowCount !== undefined ? BigInt(rowCount) : undefined,
rows,
};
}
catch (err) {
throw (0, stack_trace_utils_js_1.extendStackTrace)(err, new Error());
}
}
async rollbackTransaction(savepointName) {
await new Promise((resolve, reject) => this.#connection.rollbackTransaction((error) => {
if (error)
reject(error);
else
resolve(undefined);
}, savepointName));
}
async savepoint(savepointName) {
await new Promise((resolve, reject) => this.#connection.saveTransaction((error) => {
if (error)
reject(error);
else
resolve(undefined);
}, savepointName));
}
async *streamQuery(compiledQuery, chunkSize) {
if (!Number.isInteger(chunkSize) || chunkSize <= 0) {
throw new Error('chunkSize must be a positive integer');
}
const request = new MssqlRequest({
compiledQuery,
streamChunkSize: chunkSize,
tedious: this.#tedious,
});
this.#connection.execSql(request.request);
try {
while (true) {
const rows = await request.readChunk();
if (rows.length === 0) {
break;
}
yield { rows };
if (rows.length < chunkSize) {
break;
}
}
}
finally {
await this.#cancelRequest(request);
}
}
#getTediousIsolationLevel(isolationLevel) {
const { ISOLATION_LEVEL } = this.#tedious;
const mapper = {
'read committed': ISOLATION_LEVEL.READ_COMMITTED,
'read uncommitted': ISOLATION_LEVEL.READ_UNCOMMITTED,
'repeatable read': ISOLATION_LEVEL.REPEATABLE_READ,
serializable: ISOLATION_LEVEL.SERIALIZABLE,
snapshot: ISOLATION_LEVEL.SNAPSHOT,
};
const tediousIsolationLevel = mapper[isolationLevel];
if (tediousIsolationLevel === undefined) {
throw new Error(`Unknown isolation level: ${isolationLevel}`);
}
return tediousIsolationLevel;
}
#cancelRequest(request) {
return new Promise((resolve) => {
request.request.once('requestCompleted', resolve);
const wasCanceled = this.#connection.cancel();
if (!wasCanceled) {
request.request.off('requestCompleted', resolve);
resolve();
}
});
}
[PRIVATE_DESTROY_METHOD]() {
if ('closed' in this.#connection && this.#connection.closed) {
return Promise.resolve();
}
return new Promise((resolve) => {
this.#connection.once('end', resolve);
this.#connection.close();
});
}
async [PRIVATE_RESET_METHOD]() {
await new Promise((resolve, reject) => {
this.#connection.reset((error) => {
if (error) {
return reject(error);
}
resolve();
});
});
}
async [PRIVATE_VALIDATE_METHOD]() {
if (this.#hasSocketError || this.#isConnectionClosed()) {
return false;
}
try {
const deferred = new deferred_js_1.Deferred();
const request = new MssqlRequest({
compiledQuery: compiled_query_js_1.CompiledQuery.raw('select 1'),
onDone: deferred,
tedious: this.#tedious,
});
this.#connection.execSql(request.request);
await deferred.promise;
return true;
}
catch {
return false;
}
}
#isConnectionClosed() {
return 'closed' in this.#connection && Boolean(this.#connection.closed);
}
}
class MssqlRequest {
#request;
#rows;
#streamChunkSize;
#subscribers;
#tedious;
#rowCount;
constructor(props) {
const { compiledQuery, onDone, streamChunkSize, tedious } = props;
this.#rows = [];
this.#streamChunkSize = streamChunkSize;
this.#subscribers = {};
this.#tedious = tedious;
if (onDone) {
const subscriptionKey = 'onDone';
this.#subscribers[subscriptionKey] = (event, error) => {
if (event === 'chunkReady') {
return;
}
delete this.#subscribers[subscriptionKey];
if (event === 'error') {
return onDone.reject(error);
}
onDone.resolve({
rowCount: this.#rowCount,
rows: this.#rows,
});
};
}
this.#request = new this.#tedious.Request(compiledQuery.sql, (err, rowCount) => {
if (err) {
return Object.values(this.#subscribers).forEach((subscriber) => subscriber('error', err instanceof AggregateError ? err.errors : err));
}
this.#rowCount = rowCount;
});
this.#addParametersToRequest(compiledQuery.parameters);
this.#attachListeners();
}
get request() {
return this.#request;
}
readChunk() {
const subscriptionKey = this.readChunk.name;
return new Promise((resolve, reject) => {
this.#subscribers[subscriptionKey] = (event, error) => {
delete this.#subscribers[subscriptionKey];
if (event === 'error') {
return reject(error);
}
resolve(this.#rows.splice(0, this.#streamChunkSize));
};
this.#request.resume();
});
}
#addParametersToRequest(parameters) {
for (let i = 0; i < parameters.length; i++) {
const parameter = parameters[i];
this.#request.addParameter(String(i + 1), this.#getTediousDataType(parameter), parameter);
}
}
#attachListeners() {
const pauseAndEmitChunkReady = this.#streamChunkSize
? () => {
if (this.#streamChunkSize <= this.#rows.length) {
this.#request.pause();
Object.values(this.#subscribers).forEach((subscriber) => subscriber('chunkReady'));
}
}
: () => { };
const rowListener = (columns) => {
const row = {};
for (const column of columns) {
row[column.metadata.colName] = column.value;
}
this.#rows.push(row);
pauseAndEmitChunkReady();
};
this.#request.on('row', rowListener);
this.#request.once('requestCompleted', () => {
Object.values(this.#subscribers).forEach((subscriber) => subscriber('completed'));
this.#request.off('row', rowListener);
});
}
#getTediousDataType(value) {
if ((0, object_utils_js_1.isNull)(value) || (0, object_utils_js_1.isUndefined)(value) || (0, object_utils_js_1.isString)(value)) {
return this.#tedious.TYPES.NVarChar;
}
if ((0, object_utils_js_1.isBigInt)(value) || ((0, object_utils_js_1.isNumber)(value) && value % 1 === 0)) {
if (value < -2147483648 || value > 2147483647) {
return this.#tedious.TYPES.BigInt;
}
else {
return this.#tedious.TYPES.Int;
}
}
if ((0, object_utils_js_1.isNumber)(value)) {
return this.#tedious.TYPES.Float;
}
if ((0, object_utils_js_1.isBoolean)(value)) {
return this.#tedious.TYPES.Bit;
}
if ((0, object_utils_js_1.isDate)(value)) {
return this.#tedious.TYPES.DateTime;
}
if ((0, object_utils_js_1.isBuffer)(value)) {
return this.#tedious.TYPES.VarBinary;
}
return this.#tedious.TYPES.NVarChar;
}
}
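The `#getTediousDataType` method above infers a tedious type from each parameter's JavaScript value. A standalone sketch of the same mapping, returning type names instead of tedious `TYPES` objects (the function name is ours; the branch logic mirrors the driver's):

```typescript
type TediousTypeName =
  | 'NVarChar' | 'BigInt' | 'Int' | 'Float' | 'Bit' | 'DateTime' | 'VarBinary'

function inferTediousTypeName(value: unknown): TediousTypeName {
  if (value === null || value === undefined || typeof value === 'string') {
    return 'NVarChar'
  }
  if (typeof value === 'bigint' || (typeof value === 'number' && value % 1 === 0)) {
    // Integers outside the signed 32-bit range need BigInt.
    return value < -2147483648 || value > 2147483647 ? 'BigInt' : 'Int'
  }
  if (typeof value === 'number') {
    return 'Float' // fractional numbers
  }
  if (typeof value === 'boolean') {
    return 'Bit'
  }
  if (value instanceof Date) {
    return 'DateTime'
  }
  if (value instanceof Uint8Array) {
    // The driver checks for Node Buffers; Buffer extends Uint8Array.
    return 'VarBinary'
  }
  return 'NVarChar' // everything else falls back to NVarChar
}
```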


@@ -0,0 +1,20 @@
import type { Kysely } from '../../kysely.js';
import type { DatabaseIntrospector, DatabaseMetadata, DatabaseMetadataOptions, SchemaMetadata, TableMetadata } from '../database-introspector.js';
export declare class MssqlIntrospector implements DatabaseIntrospector {
#private;
constructor(db: Kysely<any>);
/**
* Get schema metadata.
*/
getSchemas(): Promise<SchemaMetadata[]>;
/**
* Get tables and views metadata.
*/
getTables(options?: DatabaseMetadataOptions): Promise<TableMetadata[]>;
/**
* Get the database metadata such as table and column names.
*
* @deprecated Use getTables() instead.
*/
getMetadata(options?: DatabaseMetadataOptions): Promise<DatabaseMetadata>;
}


@@ -0,0 +1,110 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MssqlIntrospector = void 0;
const migrator_js_1 = require("../../migration/migrator.js");
const object_utils_js_1 = require("../../util/object-utils.js");
class MssqlIntrospector {
#db;
constructor(db) {
this.#db = db;
}
async getSchemas() {
return await this.#db.selectFrom('sys.schemas').select('name').execute();
}
async getTables(options = { withInternalKyselyTables: false }) {
const rawColumns = await this.#db
.selectFrom('sys.tables as tables')
.leftJoin('sys.schemas as table_schemas', 'table_schemas.schema_id', 'tables.schema_id')
.innerJoin('sys.columns as columns', 'columns.object_id', 'tables.object_id')
.innerJoin('sys.types as types', 'types.user_type_id', 'columns.user_type_id')
.leftJoin('sys.schemas as type_schemas', 'type_schemas.schema_id', 'types.schema_id')
.leftJoin('sys.extended_properties as comments', (join) => join
.onRef('comments.major_id', '=', 'tables.object_id')
.onRef('comments.minor_id', '=', 'columns.column_id')
.on('comments.name', '=', 'MS_Description'))
.$if(!options.withInternalKyselyTables, (qb) => qb
.where('tables.name', '!=', migrator_js_1.DEFAULT_MIGRATION_TABLE)
.where('tables.name', '!=', migrator_js_1.DEFAULT_MIGRATION_LOCK_TABLE))
.select([
'tables.name as table_name',
(eb) => eb
.ref('tables.type')
.$castTo()
.as('table_type'),
'table_schemas.name as table_schema_name',
'columns.default_object_id as column_default_object_id',
'columns.generated_always_type_desc as column_generated_always_type',
'columns.is_computed as column_is_computed',
'columns.is_identity as column_is_identity',
'columns.is_nullable as column_is_nullable',
'columns.is_rowguidcol as column_is_rowguidcol',
'columns.name as column_name',
'types.is_nullable as type_is_nullable',
'types.name as type_name',
'type_schemas.name as type_schema_name',
'comments.value as column_comment',
])
.unionAll(this.#db
.selectFrom('sys.views as views')
.leftJoin('sys.schemas as view_schemas', 'view_schemas.schema_id', 'views.schema_id')
.innerJoin('sys.columns as columns', 'columns.object_id', 'views.object_id')
.innerJoin('sys.types as types', 'types.user_type_id', 'columns.user_type_id')
.leftJoin('sys.schemas as type_schemas', 'type_schemas.schema_id', 'types.schema_id')
.leftJoin('sys.extended_properties as comments', (join) => join
.onRef('comments.major_id', '=', 'views.object_id')
.onRef('comments.minor_id', '=', 'columns.column_id')
.on('comments.name', '=', 'MS_Description'))
.select([
'views.name as table_name',
'views.type as table_type',
'view_schemas.name as table_schema_name',
'columns.default_object_id as column_default_object_id',
'columns.generated_always_type_desc as column_generated_always_type',
'columns.is_computed as column_is_computed',
'columns.is_identity as column_is_identity',
'columns.is_nullable as column_is_nullable',
'columns.is_rowguidcol as column_is_rowguidcol',
'columns.name as column_name',
'types.is_nullable as type_is_nullable',
'types.name as type_name',
'type_schemas.name as type_schema_name',
'comments.value as column_comment',
]))
.orderBy('table_schema_name')
.orderBy('table_name')
.orderBy('column_name')
.execute();
const tableDictionary = {};
for (const rawColumn of rawColumns) {
const key = `${rawColumn.table_schema_name}.${rawColumn.table_name}`;
const table = (tableDictionary[key] =
tableDictionary[key] ||
(0, object_utils_js_1.freeze)({
columns: [],
isView: rawColumn.table_type === 'V ',
name: rawColumn.table_name,
schema: rawColumn.table_schema_name ?? undefined,
}));
table.columns.push((0, object_utils_js_1.freeze)({
dataType: rawColumn.type_name,
dataTypeSchema: rawColumn.type_schema_name ?? undefined,
hasDefaultValue: rawColumn.column_default_object_id > 0 ||
rawColumn.column_generated_always_type !== 'NOT_APPLICABLE' ||
rawColumn.column_is_identity ||
rawColumn.column_is_computed ||
rawColumn.column_is_rowguidcol,
isAutoIncrementing: rawColumn.column_is_identity,
isNullable: rawColumn.column_is_nullable && rawColumn.type_is_nullable,
name: rawColumn.column_name,
comment: rawColumn.column_comment ?? undefined,
}));
}
return Object.values(tableDictionary);
}
async getMetadata(options) {
return {
tables: await this.getTables(options),
};
}
}
exports.MssqlIntrospector = MssqlIntrospector;
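`getTables` above receives one row per column and regroups the rows into one entry per `schema.table` via `tableDictionary`. A reduced sketch of that grouping step (the row shape is trimmed down for illustration; the real query selects many more columns):

```typescript
// Hypothetical reduced row shape.
interface RawColumn {
  table_schema_name: string
  table_name: string
  column_name: string
}

// Regroup one-row-per-column results into one entry per `schema.table`,
// mirroring the tableDictionary loop in getTables.
function groupColumns(
  rawColumns: RawColumn[],
): { name: string; schema: string; columns: string[] }[] {
  const tables: Record<string, { name: string; schema: string; columns: string[] }> = {}
  for (const row of rawColumns) {
    const key = `${row.table_schema_name}.${row.table_name}`
    tables[key] ??= { name: row.table_name, schema: row.table_schema_name, columns: [] }
    tables[key].columns.push(row.column_name)
  }
  return Object.values(tables)
}
```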


@@ -0,0 +1,17 @@
import type { AddColumnNode } from '../../operation-node/add-column-node.js';
import type { AlterTableColumnAlterationNode } from '../../operation-node/alter-table-node.js';
import type { DropColumnNode } from '../../operation-node/drop-column-node.js';
import type { OffsetNode } from '../../operation-node/offset-node.js';
import type { MergeQueryNode } from '../../operation-node/merge-query-node.js';
import { DefaultQueryCompiler } from '../../query-compiler/default-query-compiler.js';
import type { CollateNode } from '../../operation-node/collate-node.js';
export declare class MssqlQueryCompiler extends DefaultQueryCompiler {
protected getCurrentParameterPlaceholder(): string;
protected visitOffset(node: OffsetNode): void;
protected compileColumnAlterations(columnAlterations: readonly AlterTableColumnAlterationNode[]): void;
protected visitAddColumn(node: AddColumnNode): void;
protected visitDropColumn(node: DropColumnNode): void;
protected visitMergeQuery(node: MergeQueryNode): void;
protected visitCollate(node: CollateNode): void;
protected announcesNewColumnDataType(): boolean;
}


@@ -0,0 +1,83 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MssqlQueryCompiler = void 0;
const default_query_compiler_js_1 = require("../../query-compiler/default-query-compiler.js");
const COLLATION_CHAR_REGEX = /^[a-z0-9_]$/i;
class MssqlQueryCompiler extends default_query_compiler_js_1.DefaultQueryCompiler {
getCurrentParameterPlaceholder() {
return `@${this.numParameters}`;
}
visitOffset(node) {
super.visitOffset(node);
this.append(' rows');
}
    // mssql allows multi-column alterations in a single statement,
    // but each command keyword (e.g. `add`, `drop column`) may only
    // appear once. it also doesn't support multiple kinds of commands
    // in the same alter table statement, but we compile that anyway
    // for the sake of WYSIWYG.
compileColumnAlterations(columnAlterations) {
const nodesByKind = {};
for (const columnAlteration of columnAlterations) {
if (!nodesByKind[columnAlteration.kind]) {
nodesByKind[columnAlteration.kind] = [];
}
nodesByKind[columnAlteration.kind].push(columnAlteration);
}
let first = true;
if (nodesByKind.AddColumnNode) {
this.append('add ');
this.compileList(nodesByKind.AddColumnNode);
first = false;
}
// multiple of these are not really supported by mssql,
// but for the sake of WYSIWYG.
if (nodesByKind.AlterColumnNode) {
if (!first)
this.append(', ');
this.compileList(nodesByKind.AlterColumnNode);
}
if (nodesByKind.DropColumnNode) {
if (!first)
this.append(', ');
this.append('drop column ');
this.compileList(nodesByKind.DropColumnNode);
}
// not really supported by mssql, but for the sake of WYSIWYG.
if (nodesByKind.ModifyColumnNode) {
if (!first)
this.append(', ');
this.compileList(nodesByKind.ModifyColumnNode);
}
// not really supported by mssql, but for the sake of WYSIWYG.
if (nodesByKind.RenameColumnNode) {
if (!first)
this.append(', ');
this.compileList(nodesByKind.RenameColumnNode);
}
}
visitAddColumn(node) {
this.visitNode(node.column);
}
visitDropColumn(node) {
this.visitNode(node.column);
}
visitMergeQuery(node) {
super.visitMergeQuery(node);
this.append(';');
}
visitCollate(node) {
this.append('collate ');
const { name } = node.collation;
for (const char of name) {
if (!COLLATION_CHAR_REGEX.test(char)) {
throw new Error(`Invalid collation: ${name}`);
}
}
this.append(name);
}
announcesNewColumnDataType() {
return false;
}
}
exports.MssqlQueryCompiler = MssqlQueryCompiler;
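The `visitCollate` method above validates the collation name character by character instead of quoting it, so the name can be appended to the SQL verbatim. A standalone sketch of the same guard (the function name is ours):

```typescript
// Each character of the collation name must be alphanumeric or an
// underscore; anything else (quotes, semicolons, spaces) is rejected.
const COLLATION_CHAR_REGEX = /^[a-z0-9_]$/i

function assertValidCollation(name: string): string {
  for (const char of name) {
    if (!COLLATION_CHAR_REGEX.test(char)) {
      throw new Error(`Invalid collation: ${name}`)
    }
  }
  return name
}
```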


@@ -0,0 +1,80 @@
import type { Kysely } from '../../kysely.js';
import { DialectAdapterBase } from '../dialect-adapter-base.js';
import type { MigrationLockOptions } from '../dialect-adapter.js';
export declare class MysqlAdapter extends DialectAdapterBase {
/**
* Whether or not this dialect supports transactional DDL.
*
* If this is true, migrations are executed inside a transaction.
*/
get supportsTransactionalDdl(): boolean;
/**
     * Whether or not this dialect supports the `returning` keyword in
     * inserts, updates and deletes.
*/
get supportsReturning(): boolean;
/**
* This method is used to acquire a lock for the migrations so that
* it's not possible for two migration operations to run in parallel.
*
* Most dialects have explicit locks that can be used, like advisory locks
* in PostgreSQL and the get_lock function in MySQL.
*
* If the dialect doesn't have explicit locks the {@link MigrationLockOptions.lockTable}
* created by Kysely can be used instead. You can access it through the `options` object.
* The lock table has two columns `id` and `is_locked` and there's only one row in the table
* whose id is {@link MigrationLockOptions.lockRowId}. `is_locked` is an integer. Kysely
* takes care of creating the lock table and inserting the one single row to it before this
* method is executed. If the dialect supports schemas and the user has specified a custom
* schema in their migration settings, the options object also contains the schema name in
* {@link MigrationLockOptions.lockTableSchema}.
*
* Here's an example of how you might implement this method for a dialect that doesn't
* have explicit locks but supports `FOR UPDATE` row locks and transactional DDL:
*
* ```ts
* import { DialectAdapterBase, type MigrationLockOptions, Kysely } from 'kysely'
*
* export class MyAdapter extends DialectAdapterBase {
* override async acquireMigrationLock(
* db: Kysely<any>,
* options: MigrationLockOptions
* ): Promise<void> {
* const queryDb = options.lockTableSchema
* ? db.withSchema(options.lockTableSchema)
* : db
*
* // Since our imaginary dialect supports transactional DDL and has
* // row locks, we can simply take a row lock here and it will guarantee
* // all subsequent calls to this method from other transactions will
* // wait until this transaction finishes.
* await queryDb
* .selectFrom(options.lockTable)
* .selectAll()
* .where('id', '=', options.lockRowId)
* .forUpdate()
* .execute()
* }
*
* override async releaseMigrationLock() {
* // noop
* }
* }
* ```
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations will be executed. Otherwise
* `db` is a single connection (session) that will be used to execute the
* migrations.
*/
acquireMigrationLock(db: Kysely<any>, _opt: MigrationLockOptions): Promise<void>;
/**
* Releases the migration lock. See {@link acquireMigrationLock}.
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations were executed. Otherwise `db`
* is a single connection (session) that was used to execute the migrations
* and the `acquireMigrationLock` call.
*/
releaseMigrationLock(db: Kysely<any>, _opt: MigrationLockOptions): Promise<void>;
}


@@ -0,0 +1,28 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MysqlAdapter = void 0;
const sql_js_1 = require("../../raw-builder/sql.js");
const dialect_adapter_base_js_1 = require("../dialect-adapter-base.js");
const LOCK_ID = 'ea586330-2c93-47c8-908d-981d9d270f9d';
const LOCK_TIMEOUT_SECONDS = 60 * 60;
class MysqlAdapter extends dialect_adapter_base_js_1.DialectAdapterBase {
get supportsTransactionalDdl() {
return false;
}
get supportsReturning() {
return false;
}
async acquireMigrationLock(db, _opt) {
// Kysely uses a single connection to run the migrations. Because of that, we
// can take a lock using `get_lock`. Locks acquired using `get_lock` get
// released when the connection is destroyed (session ends) or when the lock
// is released using `release_lock`. This way we know that the lock is either
        // released by us after successful or failed migrations OR it's released by
// MySQL if the process gets killed for some reason.
await (0, sql_js_1.sql) `select get_lock(${sql_js_1.sql.lit(LOCK_ID)}, ${sql_js_1.sql.lit(LOCK_TIMEOUT_SECONDS)})`.execute(db);
}
async releaseMigrationLock(db, _opt) {
await (0, sql_js_1.sql) `select release_lock(${sql_js_1.sql.lit(LOCK_ID)})`.execute(db);
}
}
exports.MysqlAdapter = MysqlAdapter;
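Since `sql.lit` inlines its argument as a literal rather than binding it as a parameter, the two lock statements above compile to fixed SQL strings. A sketch that renders them directly (these helper functions are illustrative only, not part of the adapter):

```typescript
// The literals used by the adapter above.
const LOCK_ID = 'ea586330-2c93-47c8-908d-981d9d270f9d'
const LOCK_TIMEOUT_SECONDS = 60 * 60

// get_lock blocks up to the timeout; the lock is released explicitly or
// when the session ends, which is what makes it safe for migrations.
function acquireLockSql(): string {
  return `select get_lock('${LOCK_ID}', ${LOCK_TIMEOUT_SECONDS})`
}

function releaseLockSql(): string {
  return `select release_lock('${LOCK_ID}')`
}
```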


@@ -0,0 +1,56 @@
import type { DatabaseConnection } from '../../driver/database-connection.js';
/**
* Config for the MySQL dialect.
*
* https://github.com/sidorares/node-mysql2#using-connection-pools
*/
export interface MysqlDialectConfig {
/**
* A mysql2 Pool instance or a function that returns one.
*
* If a function is provided, it's called once when the first query is executed.
*
* https://github.com/sidorares/node-mysql2#using-connection-pools
*/
pool: MysqlPool | (() => Promise<MysqlPool>);
/**
* Called once for each created connection.
*/
onCreateConnection?: (connection: DatabaseConnection) => Promise<void>;
/**
* Called every time a connection is acquired from the connection pool.
*/
onReserveConnection?: (connection: DatabaseConnection) => Promise<void>;
}
/**
* This interface is the subset of mysql2 driver's `Pool` class that
* kysely needs.
*
 * We don't use the type from `mysql2` here to avoid a dependency on it.
*
* https://github.com/sidorares/node-mysql2#using-connection-pools
*/
export interface MysqlPool {
getConnection(callback: (error: unknown, connection: MysqlPoolConnection) => void): void;
end(callback: (error: unknown) => void): void;
}
export interface MysqlPoolConnection {
query(sql: string, parameters: Array<unknown>): {
stream: <T>(options: MysqlStreamOptions) => MysqlStream<T>;
};
query(sql: string, parameters: Array<unknown>, callback: (error: unknown, result: MysqlQueryResult) => void): void;
release(): void;
}
export interface MysqlStreamOptions {
highWaterMark?: number;
objectMode?: true;
}
export interface MysqlStream<T> {
[Symbol.asyncIterator](): AsyncIterableIterator<T>;
}
export interface MysqlOkPacket {
affectedRows: number;
changedRows: number;
insertId: number;
}
export type MysqlQueryResult = MysqlOkPacket | Record<string, unknown>[];


@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });


@@ -0,0 +1,61 @@
import type { Driver } from '../../driver/driver.js';
import type { Kysely } from '../../kysely.js';
import type { QueryCompiler } from '../../query-compiler/query-compiler.js';
import type { Dialect } from '../dialect.js';
import type { DatabaseIntrospector } from '../database-introspector.js';
import type { DialectAdapter } from '../dialect-adapter.js';
import type { MysqlDialectConfig } from './mysql-dialect-config.js';
/**
* MySQL dialect that uses the [mysql2](https://github.com/sidorares/node-mysql2#readme) library.
*
* The constructor takes an instance of {@link MysqlDialectConfig}.
*
* ```ts
* import { createPool } from 'mysql2'
*
* new MysqlDialect({
* pool: createPool({
* database: 'some_db',
* host: 'localhost',
* })
* })
* ```
*
* If you want the pool to only be created once it's first used, `pool`
* can be a function:
*
* ```ts
* import { createPool } from 'mysql2'
*
* new MysqlDialect({
* pool: async () => createPool({
* database: 'some_db',
* host: 'localhost',
* })
* })
* ```
*/
export declare class MysqlDialect implements Dialect {
#private;
constructor(config: MysqlDialectConfig);
/**
* Creates a driver for the dialect.
*/
createDriver(): Driver;
/**
* Creates a query compiler for the dialect.
*/
createQueryCompiler(): QueryCompiler;
/**
* Creates an adapter for the dialect.
*/
createAdapter(): DialectAdapter;
/**
* Creates a database introspector that can be used to get database metadata
* such as the table names and column names of those tables.
*
* `db` never has any plugins installed. It's created using
* {@link Kysely.withoutPlugins}.
*/
createIntrospector(db: Kysely<any>): DatabaseIntrospector;
}


@@ -0,0 +1,56 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MysqlDialect = void 0;
const mysql_driver_js_1 = require("./mysql-driver.js");
const mysql_query_compiler_js_1 = require("./mysql-query-compiler.js");
const mysql_introspector_js_1 = require("./mysql-introspector.js");
const mysql_adapter_js_1 = require("./mysql-adapter.js");
/**
* MySQL dialect that uses the [mysql2](https://github.com/sidorares/node-mysql2#readme) library.
*
* The constructor takes an instance of {@link MysqlDialectConfig}.
*
* ```ts
* import { createPool } from 'mysql2'
*
* new MysqlDialect({
* pool: createPool({
* database: 'some_db',
* host: 'localhost',
* })
* })
* ```
*
* If you want the pool to only be created once it's first used, `pool`
* can be a function:
*
* ```ts
* import { createPool } from 'mysql2'
*
* new MysqlDialect({
* pool: async () => createPool({
* database: 'some_db',
* host: 'localhost',
* })
* })
* ```
*/
class MysqlDialect {
#config;
constructor(config) {
this.#config = config;
}
createDriver() {
return new mysql_driver_js_1.MysqlDriver(this.#config);
}
createQueryCompiler() {
return new mysql_query_compiler_js_1.MysqlQueryCompiler();
}
createAdapter() {
return new mysql_adapter_js_1.MysqlAdapter();
}
createIntrospector(db) {
return new mysql_introspector_js_1.MysqlIntrospector(db);
}
}
exports.MysqlDialect = MysqlDialect;


@@ -0,0 +1,52 @@
import type { DatabaseConnection, QueryResult } from '../../driver/database-connection.js';
import type { Driver, TransactionSettings } from '../../driver/driver.js';
import { CompiledQuery } from '../../query-compiler/compiled-query.js';
import type { QueryCompiler } from '../../query-compiler/query-compiler.js';
import type { MysqlDialectConfig, MysqlPoolConnection } from './mysql-dialect-config.js';
declare const PRIVATE_RELEASE_METHOD: unique symbol;
export declare class MysqlDriver implements Driver {
#private;
constructor(configOrPool: MysqlDialectConfig);
/**
* Initializes the driver.
*
* After calling this method the driver should be usable and `acquireConnection` etc.
* methods should be callable.
*/
init(): Promise<void>;
/**
* Acquires a new connection from the pool.
*/
acquireConnection(): Promise<DatabaseConnection>;
/**
* Begins a transaction.
*/
beginTransaction(connection: DatabaseConnection, settings: TransactionSettings): Promise<void>;
/**
* Commits a transaction.
*/
commitTransaction(connection: DatabaseConnection): Promise<void>;
/**
* Rolls back a transaction.
*/
rollbackTransaction(connection: DatabaseConnection): Promise<void>;
savepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
rollbackToSavepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
releaseSavepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
/**
* Releases a connection back to the pool.
*/
releaseConnection(connection: MysqlConnection): Promise<void>;
/**
* Destroys the driver and releases all resources.
*/
destroy(): Promise<void>;
}
declare class MysqlConnection implements DatabaseConnection {
#private;
constructor(rawConnection: MysqlPoolConnection);
executeQuery<O>(compiledQuery: CompiledQuery): Promise<QueryResult<O>>;
streamQuery<O>(compiledQuery: CompiledQuery, _chunkSize: number): AsyncIterableIterator<QueryResult<O>>;
[PRIVATE_RELEASE_METHOD](): void;
}
export {};


@@ -0,0 +1,180 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MysqlDriver = void 0;
const savepoint_parser_js_1 = require("../../parser/savepoint-parser.js");
const compiled_query_js_1 = require("../../query-compiler/compiled-query.js");
const object_utils_js_1 = require("../../util/object-utils.js");
const query_id_js_1 = require("../../util/query-id.js");
const stack_trace_utils_js_1 = require("../../util/stack-trace-utils.js");
const PRIVATE_RELEASE_METHOD = Symbol();
class MysqlDriver {
#config;
#connections = new WeakMap();
#pool;
constructor(configOrPool) {
this.#config = (0, object_utils_js_1.freeze)({ ...configOrPool });
}
async init() {
this.#pool = (0, object_utils_js_1.isFunction)(this.#config.pool)
? await this.#config.pool()
: this.#config.pool;
}
async acquireConnection() {
const rawConnection = await this.#acquireConnection();
let connection = this.#connections.get(rawConnection);
if (!connection) {
connection = new MysqlConnection(rawConnection);
this.#connections.set(rawConnection, connection);
// The driver must take care of calling `onCreateConnection` when a new
// connection is created. The `mysql2` module doesn't provide an async hook
// for the connection creation. We need to call the method explicitly.
if (this.#config?.onCreateConnection) {
await this.#config.onCreateConnection(connection);
}
}
if (this.#config?.onReserveConnection) {
await this.#config.onReserveConnection(connection);
}
return connection;
}
async #acquireConnection() {
return new Promise((resolve, reject) => {
this.#pool.getConnection(async (err, rawConnection) => {
if (err) {
reject(err);
}
else {
resolve(rawConnection);
}
});
});
}
async beginTransaction(connection, settings) {
if (settings.isolationLevel || settings.accessMode) {
const parts = [];
if (settings.isolationLevel) {
parts.push(`isolation level ${settings.isolationLevel}`);
}
if (settings.accessMode) {
parts.push(settings.accessMode);
}
const sql = `set transaction ${parts.join(', ')}`;
// On MySQL this sets the isolation level of the next transaction.
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw(sql));
}
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('begin'));
}
async commitTransaction(connection) {
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('commit'));
}
async rollbackTransaction(connection) {
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('rollback'));
}
async savepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('savepoint', savepointName), (0, query_id_js_1.createQueryId)()));
}
async rollbackToSavepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('rollback to', savepointName), (0, query_id_js_1.createQueryId)()));
}
async releaseSavepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('release savepoint', savepointName), (0, query_id_js_1.createQueryId)()));
}
async releaseConnection(connection) {
connection[PRIVATE_RELEASE_METHOD]();
}
async destroy() {
return new Promise((resolve, reject) => {
this.#pool.end((err) => {
if (err) {
reject(err);
}
else {
resolve();
}
});
});
}
}
exports.MysqlDriver = MysqlDriver;
function isOkPacket(obj) {
return (0, object_utils_js_1.isObject)(obj) && 'insertId' in obj && 'affectedRows' in obj;
}
class MysqlConnection {
#rawConnection;
constructor(rawConnection) {
this.#rawConnection = rawConnection;
}
async executeQuery(compiledQuery) {
try {
const result = await this.#executeQuery(compiledQuery);
if (isOkPacket(result)) {
const { insertId, affectedRows, changedRows } = result;
return {
insertId: insertId !== undefined &&
insertId !== null &&
insertId.toString() !== '0'
? BigInt(insertId)
: undefined,
numAffectedRows: affectedRows !== undefined && affectedRows !== null
? BigInt(affectedRows)
: undefined,
numChangedRows: changedRows !== undefined && changedRows !== null
? BigInt(changedRows)
: undefined,
rows: [],
};
}
else if (Array.isArray(result)) {
return {
rows: result,
};
}
return {
rows: [],
};
}
catch (err) {
throw (0, stack_trace_utils_js_1.extendStackTrace)(err, new Error());
}
}
#executeQuery(compiledQuery) {
return new Promise((resolve, reject) => {
this.#rawConnection.query(compiledQuery.sql, compiledQuery.parameters, (err, result) => {
if (err) {
reject(err);
}
else {
resolve(result);
}
});
});
}
async *streamQuery(compiledQuery, _chunkSize) {
const stream = this.#rawConnection
.query(compiledQuery.sql, compiledQuery.parameters)
.stream({
objectMode: true,
});
try {
for await (const row of stream) {
yield {
rows: [row],
};
}
}
catch (ex) {
if (ex &&
typeof ex === 'object' &&
'code' in ex &&
// @ts-ignore
ex.code === 'ERR_STREAM_PREMATURE_CLOSE') {
// Most likely because of https://github.com/mysqljs/mysql/blob/master/lib/protocol/sequences/Query.js#L220
return;
}
throw ex;
}
}
[PRIVATE_RELEASE_METHOD]() {
this.#rawConnection.release();
}
}
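The branching in `executeQuery` above reduces to a small mapping rule: an object carrying `insertId` and `affectedRows` is an OK packet (a write statement), while an array is a row set (a read). A standalone sketch of that rule (`mapResult` is an illustrative name, not a kysely export):

```typescript
interface OkPacket {
  insertId: number
  affectedRows: number
}

type MysqlResult = OkPacket | Record<string, unknown>[]

// Mirrors isOkPacket plus the result mapping in MysqlConnection.executeQuery:
// writes yield no rows but expose insertId as a bigint (0 means "none"),
// reads yield the rows as-is.
function mapResult(result: MysqlResult): { rows: unknown[]; insertId?: bigint } {
  if (!Array.isArray(result) && 'insertId' in result && 'affectedRows' in result) {
    return {
      rows: [],
      insertId: result.insertId !== 0 ? BigInt(result.insertId) : undefined,
    }
  }
  return Array.isArray(result) ? { rows: result } : { rows: [] }
}
```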


@@ -0,0 +1,20 @@
import type { DatabaseIntrospector, DatabaseMetadata, DatabaseMetadataOptions, SchemaMetadata, TableMetadata } from '../database-introspector.js';
import type { Kysely } from '../../kysely.js';
export declare class MysqlIntrospector implements DatabaseIntrospector {
#private;
constructor(db: Kysely<any>);
/**
* Get schema metadata.
*/
getSchemas(): Promise<SchemaMetadata[]>;
/**
* Get tables and views metadata.
*/
getTables(options?: DatabaseMetadataOptions): Promise<TableMetadata[]>;
/**
* Get the database metadata such as table and column names.
*
* @deprecated Use getTables() instead.
*/
getMetadata(options?: DatabaseMetadataOptions): Promise<DatabaseMetadata>;
}


@@ -0,0 +1,79 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MysqlIntrospector = void 0;
const migrator_js_1 = require("../../migration/migrator.js");
const object_utils_js_1 = require("../../util/object-utils.js");
const sql_js_1 = require("../../raw-builder/sql.js");
class MysqlIntrospector {
#db;
constructor(db) {
this.#db = db;
}
async getSchemas() {
let rawSchemas = await this.#db
.selectFrom('information_schema.schemata')
.select('schema_name')
.$castTo()
.execute();
return rawSchemas.map((it) => ({ name: it.SCHEMA_NAME }));
}
async getTables(options = { withInternalKyselyTables: false }) {
let query = this.#db
.selectFrom('information_schema.columns as columns')
.innerJoin('information_schema.tables as tables', (b) => b
.onRef('columns.TABLE_CATALOG', '=', 'tables.TABLE_CATALOG')
.onRef('columns.TABLE_SCHEMA', '=', 'tables.TABLE_SCHEMA')
.onRef('columns.TABLE_NAME', '=', 'tables.TABLE_NAME'))
.select([
'columns.COLUMN_NAME',
'columns.COLUMN_DEFAULT',
'columns.TABLE_NAME',
'columns.TABLE_SCHEMA',
'tables.TABLE_TYPE',
'columns.IS_NULLABLE',
'columns.DATA_TYPE',
'columns.EXTRA',
'columns.COLUMN_COMMENT',
])
.where('columns.TABLE_SCHEMA', '=', (0, sql_js_1.sql) `database()`)
.orderBy('columns.TABLE_NAME')
.orderBy('columns.ORDINAL_POSITION')
.$castTo();
if (!options.withInternalKyselyTables) {
query = query
.where('columns.TABLE_NAME', '!=', migrator_js_1.DEFAULT_MIGRATION_TABLE)
.where('columns.TABLE_NAME', '!=', migrator_js_1.DEFAULT_MIGRATION_LOCK_TABLE);
}
const rawColumns = await query.execute();
return this.#parseTableMetadata(rawColumns);
}
async getMetadata(options) {
return {
tables: await this.getTables(options),
};
}
#parseTableMetadata(columns) {
return columns.reduce((tables, it) => {
let table = tables.find((tbl) => tbl.name === it.TABLE_NAME);
if (!table) {
table = (0, object_utils_js_1.freeze)({
name: it.TABLE_NAME,
isView: it.TABLE_TYPE === 'VIEW',
schema: it.TABLE_SCHEMA,
columns: [],
});
tables.push(table);
}
table.columns.push((0, object_utils_js_1.freeze)({
name: it.COLUMN_NAME,
dataType: it.DATA_TYPE,
isNullable: it.IS_NULLABLE === 'YES',
isAutoIncrementing: it.EXTRA.toLowerCase().includes('auto_increment'),
hasDefaultValue: it.COLUMN_DEFAULT !== null,
comment: it.COLUMN_COMMENT === '' ? undefined : it.COLUMN_COMMENT,
}));
return tables;
}, []);
}
}
exports.MysqlIntrospector = MysqlIntrospector;


@@ -0,0 +1,13 @@
import type { CreateIndexNode } from '../../operation-node/create-index-node.js';
import { DefaultQueryCompiler } from '../../query-compiler/default-query-compiler.js';
export declare class MysqlQueryCompiler extends DefaultQueryCompiler {
protected getCurrentParameterPlaceholder(): string;
protected getLeftExplainOptionsWrapper(): string;
protected getExplainOptionAssignment(): string;
protected getExplainOptionsDelimiter(): string;
protected getRightExplainOptionsWrapper(): string;
protected getLeftIdentifierWrapper(): string;
protected getRightIdentifierWrapper(): string;
protected sanitizeIdentifier(identifier: string): string;
protected visitCreateIndex(node: CreateIndexNode): void;
}


@@ -0,0 +1,60 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.MysqlQueryCompiler = void 0;
const default_query_compiler_js_1 = require("../../query-compiler/default-query-compiler.js");
const ID_WRAP_REGEX = /`/g;
class MysqlQueryCompiler extends default_query_compiler_js_1.DefaultQueryCompiler {
getCurrentParameterPlaceholder() {
return '?';
}
getLeftExplainOptionsWrapper() {
return '';
}
getExplainOptionAssignment() {
return '=';
}
getExplainOptionsDelimiter() {
return ' ';
}
getRightExplainOptionsWrapper() {
return '';
}
getLeftIdentifierWrapper() {
return '`';
}
getRightIdentifierWrapper() {
return '`';
}
sanitizeIdentifier(identifier) {
return identifier.replace(ID_WRAP_REGEX, '``');
}
visitCreateIndex(node) {
this.append('create ');
if (node.unique) {
this.append('unique ');
}
this.append('index ');
if (node.ifNotExists) {
this.append('if not exists ');
}
this.visitNode(node.name);
if (node.using) {
this.append(' using ');
this.visitNode(node.using);
}
if (node.table) {
this.append(' on ');
this.visitNode(node.table);
}
if (node.columns) {
this.append(' (');
this.compileList(node.columns);
this.append(')');
}
if (node.where) {
this.append(' ');
this.visitNode(node.where);
}
}
}
exports.MysqlQueryCompiler = MysqlQueryCompiler;
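The compiler above quotes identifiers with backticks and escapes embedded backticks by doubling them (`sanitizeIdentifier` plus the left/right wrapper getters). Combined into one standalone helper, as a sketch (`quoteMysqlIdentifier` is an illustrative name):

```typescript
// Double any embedded backtick, then wrap the identifier in backticks —
// the same rule MysqlQueryCompiler applies via sanitizeIdentifier and
// getLeftIdentifierWrapper/getRightIdentifierWrapper.
function quoteMysqlIdentifier(identifier: string): string {
  return '`' + identifier.replace(/`/g, '``') + '`'
}
```

So a hostile identifier such as `` user`s `` compiles to `` `user``s` `` instead of breaking out of the quoting.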


@@ -0,0 +1,80 @@
import type { Kysely } from '../../kysely.js';
import { DialectAdapterBase } from '../dialect-adapter-base.js';
import type { MigrationLockOptions } from '../dialect-adapter.js';
export declare class PostgresAdapter extends DialectAdapterBase {
/**
* Whether or not this dialect supports transactional DDL.
*
* If this is true, migrations are executed inside a transaction.
*/
get supportsTransactionalDdl(): boolean;
/**
     * Whether or not this dialect supports the `returning` clause in inserts,
     * updates and deletes.
*/
get supportsReturning(): boolean;
/**
* This method is used to acquire a lock for the migrations so that
* it's not possible for two migration operations to run in parallel.
*
* Most dialects have explicit locks that can be used, like advisory locks
* in PostgreSQL and the get_lock function in MySQL.
*
* If the dialect doesn't have explicit locks the {@link MigrationLockOptions.lockTable}
* created by Kysely can be used instead. You can access it through the `options` object.
* The lock table has two columns `id` and `is_locked` and there's only one row in the table
* whose id is {@link MigrationLockOptions.lockRowId}. `is_locked` is an integer. Kysely
* takes care of creating the lock table and inserting the one single row to it before this
* method is executed. If the dialect supports schemas and the user has specified a custom
* schema in their migration settings, the options object also contains the schema name in
* {@link MigrationLockOptions.lockTableSchema}.
*
* Here's an example of how you might implement this method for a dialect that doesn't
* have explicit locks but supports `FOR UPDATE` row locks and transactional DDL:
*
* ```ts
* import { DialectAdapterBase, type MigrationLockOptions, Kysely } from 'kysely'
*
* export class MyAdapter extends DialectAdapterBase {
* override async acquireMigrationLock(
* db: Kysely<any>,
* options: MigrationLockOptions
* ): Promise<void> {
* const queryDb = options.lockTableSchema
* ? db.withSchema(options.lockTableSchema)
* : db
*
* // Since our imaginary dialect supports transactional DDL and has
* // row locks, we can simply take a row lock here and it will guarantee
* // all subsequent calls to this method from other transactions will
* // wait until this transaction finishes.
* await queryDb
* .selectFrom(options.lockTable)
* .selectAll()
* .where('id', '=', options.lockRowId)
* .forUpdate()
* .execute()
* }
*
* override async releaseMigrationLock() {
* // noop
* }
* }
* ```
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations will be executed. Otherwise
* `db` is a single connection (session) that will be used to execute the
* migrations.
*/
acquireMigrationLock(db: Kysely<any>, _opt: MigrationLockOptions): Promise<void>;
/**
* Releases the migration lock. See {@link acquireMigrationLock}.
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations were executed. Otherwise `db`
* is a single connection (session) that was used to execute the migrations
* and the `acquireMigrationLock` call.
*/
releaseMigrationLock(_db: Kysely<any>, _opt: MigrationLockOptions): Promise<void>;
}


@@ -0,0 +1,25 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.PostgresAdapter = void 0;
const sql_js_1 = require("../../raw-builder/sql.js");
const dialect_adapter_base_js_1 = require("../dialect-adapter-base.js");
// Random id for our transaction lock.
const LOCK_ID = BigInt('3853314791062309107');
class PostgresAdapter extends dialect_adapter_base_js_1.DialectAdapterBase {
get supportsTransactionalDdl() {
return true;
}
get supportsReturning() {
return true;
}
async acquireMigrationLock(db, _opt) {
// Acquire a transaction level advisory lock.
await (0, sql_js_1.sql) `select pg_advisory_xact_lock(${sql_js_1.sql.lit(LOCK_ID)})`.execute(db);
}
async releaseMigrationLock(_db, _opt) {
// Nothing to do here. `pg_advisory_xact_lock` is automatically released at the
        // end of the transaction and since `supportsTransactionalDdl` is true, we know
// the `db` instance passed to acquireMigrationLock is actually a transaction.
}
}
exports.PostgresAdapter = PostgresAdapter;


@@ -0,0 +1,68 @@
import type { DatabaseConnection } from '../../driver/database-connection.js';
/**
* Config for the PostgreSQL dialect.
*/
export interface PostgresDialectConfig {
/**
* A postgres Pool instance or a function that returns one.
*
* If a function is provided, it's called once when the first query is executed.
*
* https://node-postgres.com/apis/pool
*/
pool: PostgresPool | (() => Promise<PostgresPool>);
/**
* https://github.com/brianc/node-postgres/tree/master/packages/pg-cursor
*
* ```ts
* import { PostgresDialect } from 'kysely'
* import { Pool } from 'pg'
* import Cursor from 'pg-cursor'
* // or import * as Cursor from 'pg-cursor'
*
* new PostgresDialect({
* cursor: Cursor,
* pool: new Pool('postgres://localhost:5432/mydb')
* })
* ```
*/
cursor?: PostgresCursorConstructor;
/**
* Called once for each created connection.
*/
onCreateConnection?: (connection: DatabaseConnection) => Promise<void>;
/**
* Called every time a connection is acquired from the pool.
*/
onReserveConnection?: (connection: DatabaseConnection) => Promise<void>;
}
/**
* This interface is the subset of pg driver's `Pool` class that
* kysely needs.
*
 * We don't use the type from `pg` here to avoid a dependency on it.
*
* https://node-postgres.com/apis/pool
*/
export interface PostgresPool {
connect(): Promise<PostgresPoolClient>;
end(): Promise<void>;
}
export interface PostgresPoolClient {
query<R>(sql: string, parameters: ReadonlyArray<unknown>): Promise<PostgresQueryResult<R>>;
query<R>(cursor: PostgresCursor<R>): PostgresCursor<R>;
release(): void;
}
export interface PostgresCursor<T> {
read(rowsCount: number): Promise<T[]>;
close(): Promise<void>;
}
export type PostgresCursorConstructor = new <T>(sql: string, parameters: unknown[]) => PostgresCursor<T>;
export interface PostgresQueryResult<R> {
command: 'UPDATE' | 'DELETE' | 'INSERT' | 'SELECT' | 'MERGE';
rowCount: number;
rows: R[];
}
export interface PostgresStream<T> {
[Symbol.asyncIterator](): AsyncIterableIterator<T>;
}

View File

@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });


@@ -0,0 +1,61 @@
import type { Driver } from '../../driver/driver.js';
import type { Kysely } from '../../kysely.js';
import type { QueryCompiler } from '../../query-compiler/query-compiler.js';
import type { Dialect } from '../dialect.js';
import type { DatabaseIntrospector } from '../database-introspector.js';
import type { DialectAdapter } from '../dialect-adapter.js';
import type { PostgresDialectConfig } from './postgres-dialect-config.js';
/**
* PostgreSQL dialect that uses the [pg](https://node-postgres.com/) library.
*
* The constructor takes an instance of {@link PostgresDialectConfig}.
*
* ```ts
* import { Pool } from 'pg'
*
* new PostgresDialect({
* pool: new Pool({
* database: 'some_db',
* host: 'localhost',
* })
* })
* ```
*
* If you want the pool to only be created once it's first used, `pool`
* can be a function:
*
* ```ts
* import { Pool } from 'pg'
*
* new PostgresDialect({
* pool: async () => new Pool({
* database: 'some_db',
* host: 'localhost',
* })
* })
* ```
*/
export declare class PostgresDialect implements Dialect {
#private;
constructor(config: PostgresDialectConfig);
/**
* Creates a driver for the dialect.
*/
createDriver(): Driver;
/**
* Creates a query compiler for the dialect.
*/
createQueryCompiler(): QueryCompiler;
/**
* Creates an adapter for the dialect.
*/
createAdapter(): DialectAdapter;
/**
* Creates a database introspector that can be used to get database metadata
* such as the table names and column names of those tables.
*
* `db` never has any plugins installed. It's created using
* {@link Kysely.withoutPlugins}.
*/
createIntrospector(db: Kysely<any>): DatabaseIntrospector;
}


@@ -0,0 +1,56 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.PostgresDialect = void 0;
const postgres_driver_js_1 = require("./postgres-driver.js");
const postgres_introspector_js_1 = require("./postgres-introspector.js");
const postgres_query_compiler_js_1 = require("./postgres-query-compiler.js");
const postgres_adapter_js_1 = require("./postgres-adapter.js");
/**
* PostgreSQL dialect that uses the [pg](https://node-postgres.com/) library.
*
* The constructor takes an instance of {@link PostgresDialectConfig}.
*
* ```ts
* import { Pool } from 'pg'
*
* new PostgresDialect({
* pool: new Pool({
* database: 'some_db',
* host: 'localhost',
* })
* })
* ```
*
* If you want the pool to only be created once it's first used, `pool`
* can be a function:
*
* ```ts
* import { Pool } from 'pg'
*
* new PostgresDialect({
* pool: async () => new Pool({
* database: 'some_db',
* host: 'localhost',
* })
* })
* ```
*/
class PostgresDialect {
#config;
constructor(config) {
this.#config = config;
}
createDriver() {
return new postgres_driver_js_1.PostgresDriver(this.#config);
}
createQueryCompiler() {
return new postgres_query_compiler_js_1.PostgresQueryCompiler();
}
createAdapter() {
return new postgres_adapter_js_1.PostgresAdapter();
}
createIntrospector(db) {
return new postgres_introspector_js_1.PostgresIntrospector(db);
}
}
exports.PostgresDialect = PostgresDialect;


@@ -0,0 +1,55 @@
import type { DatabaseConnection, QueryResult } from '../../driver/database-connection.js';
import type { Driver, TransactionSettings } from '../../driver/driver.js';
import { CompiledQuery } from '../../query-compiler/compiled-query.js';
import type { QueryCompiler } from '../../query-compiler/query-compiler.js';
import type { PostgresCursorConstructor, PostgresDialectConfig, PostgresPoolClient } from './postgres-dialect-config.js';
declare const PRIVATE_RELEASE_METHOD: unique symbol;
export declare class PostgresDriver implements Driver {
#private;
constructor(config: PostgresDialectConfig);
/**
* Initializes the driver.
*
* After calling this method the driver should be usable and `acquireConnection` etc.
* methods should be callable.
*/
init(): Promise<void>;
/**
* Acquires a new connection from the pool.
*/
acquireConnection(): Promise<DatabaseConnection>;
/**
* Begins a transaction.
*/
beginTransaction(connection: DatabaseConnection, settings: TransactionSettings): Promise<void>;
/**
* Commits a transaction.
*/
commitTransaction(connection: DatabaseConnection): Promise<void>;
/**
* Rolls back a transaction.
*/
rollbackTransaction(connection: DatabaseConnection): Promise<void>;
savepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
rollbackToSavepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
releaseSavepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
/**
* Releases a connection back to the pool.
*/
releaseConnection(connection: PostgresConnection): Promise<void>;
/**
* Destroys the driver and releases all resources.
*/
destroy(): Promise<void>;
}
interface PostgresConnectionOptions {
cursor: PostgresCursorConstructor | null;
}
declare class PostgresConnection implements DatabaseConnection {
#private;
constructor(client: PostgresPoolClient, options: PostgresConnectionOptions);
executeQuery<O>(compiledQuery: CompiledQuery): Promise<QueryResult<O>>;
streamQuery<O>(compiledQuery: CompiledQuery, chunkSize: number): AsyncIterableIterator<QueryResult<O>>;
[PRIVATE_RELEASE_METHOD](): void;
}
export {};


@@ -0,0 +1,134 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.PostgresDriver = void 0;
const savepoint_parser_js_1 = require("../../parser/savepoint-parser.js");
const compiled_query_js_1 = require("../../query-compiler/compiled-query.js");
const object_utils_js_1 = require("../../util/object-utils.js");
const query_id_js_1 = require("../../util/query-id.js");
const stack_trace_utils_js_1 = require("../../util/stack-trace-utils.js");
const PRIVATE_RELEASE_METHOD = Symbol();
class PostgresDriver {
#config;
#connections = new WeakMap();
#pool;
constructor(config) {
this.#config = (0, object_utils_js_1.freeze)({ ...config });
}
async init() {
this.#pool = (0, object_utils_js_1.isFunction)(this.#config.pool)
? await this.#config.pool()
: this.#config.pool;
}
async acquireConnection() {
const client = await this.#pool.connect();
let connection = this.#connections.get(client);
if (!connection) {
connection = new PostgresConnection(client, {
cursor: this.#config.cursor ?? null,
});
this.#connections.set(client, connection);
// The driver must take care of calling `onCreateConnection` when a new
// connection is created. The `pg` module doesn't provide an async hook
// for the connection creation. We need to call the method explicitly.
if (this.#config.onCreateConnection) {
await this.#config.onCreateConnection(connection);
}
}
if (this.#config.onReserveConnection) {
await this.#config.onReserveConnection(connection);
}
return connection;
}
async beginTransaction(connection, settings) {
if (settings.isolationLevel || settings.accessMode) {
let sql = 'start transaction';
if (settings.isolationLevel) {
sql += ` isolation level ${settings.isolationLevel}`;
}
if (settings.accessMode) {
sql += ` ${settings.accessMode}`;
}
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw(sql));
}
else {
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('begin'));
}
}
async commitTransaction(connection) {
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('commit'));
}
async rollbackTransaction(connection) {
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('rollback'));
}
async savepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('savepoint', savepointName), (0, query_id_js_1.createQueryId)()));
}
async rollbackToSavepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('rollback to', savepointName), (0, query_id_js_1.createQueryId)()));
}
async releaseSavepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('release', savepointName), (0, query_id_js_1.createQueryId)()));
}
async releaseConnection(connection) {
connection[PRIVATE_RELEASE_METHOD]();
}
async destroy() {
if (this.#pool) {
const pool = this.#pool;
this.#pool = undefined;
await pool.end();
}
}
}
exports.PostgresDriver = PostgresDriver;
class PostgresConnection {
#client;
#options;
constructor(client, options) {
this.#client = client;
this.#options = options;
}
async executeQuery(compiledQuery) {
try {
const { command, rowCount, rows } = await this.#client.query(compiledQuery.sql, [...compiledQuery.parameters]);
return {
numAffectedRows: command === 'INSERT' ||
command === 'UPDATE' ||
command === 'DELETE' ||
command === 'MERGE'
? BigInt(rowCount)
: undefined,
rows: rows ?? [],
};
}
catch (err) {
throw (0, stack_trace_utils_js_1.extendStackTrace)(err, new Error());
}
}
async *streamQuery(compiledQuery, chunkSize) {
if (!this.#options.cursor) {
throw new Error("'cursor' is not present in your postgres dialect config. It's required to make streaming work in postgres.");
}
if (!Number.isInteger(chunkSize) || chunkSize <= 0) {
throw new Error('chunkSize must be a positive integer');
}
const cursor = this.#client.query(new this.#options.cursor(compiledQuery.sql, compiledQuery.parameters.slice()));
try {
while (true) {
const rows = await cursor.read(chunkSize);
if (rows.length === 0) {
break;
}
yield {
rows,
};
}
}
finally {
await cursor.close();
}
}
[PRIVATE_RELEASE_METHOD]() {
this.#client.release();
}
}
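`streamQuery` above is a chunked read loop over a pg-cursor: read up to `chunkSize` rows, yield them, stop on an empty batch, and always close the cursor. The same loop isolated as a sketch, with a hypothetical in-memory cursor (`arrayCursor`) standing in for a real pg-cursor instance:

```typescript
// Subset of the pg-cursor shape the loop needs (matches PostgresCursor above).
interface Cursor<T> {
  read(rowsCount: number): Promise<T[]>
  close(): Promise<void>
}

// Read in chunks until the cursor is exhausted; close it even on early exit,
// mirroring the try/finally in PostgresConnection.streamQuery.
async function* streamRows<T>(cursor: Cursor<T>, chunkSize: number): AsyncGenerator<T[]> {
  try {
    while (true) {
      const rows = await cursor.read(chunkSize)
      if (rows.length === 0) break
      yield rows
    }
  } finally {
    await cursor.close()
  }
}

// Hypothetical cursor over a fixed array, for illustration only.
function arrayCursor<T>(data: T[]): Cursor<T> & { closed: boolean } {
  let offset = 0
  return {
    closed: false,
    async read(rowsCount) {
      const chunk = data.slice(offset, offset + rowsCount)
      offset += chunk.length
      return chunk
    },
    async close() {
      this.closed = true
    },
  }
}
```

The empty-batch check is what terminates the loop, which is why the driver validates that `chunkSize` is a positive integer before starting.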


@@ -0,0 +1,20 @@
import type { DatabaseIntrospector, DatabaseMetadata, DatabaseMetadataOptions, SchemaMetadata, TableMetadata } from '../database-introspector.js';
import type { Kysely } from '../../kysely.js';
export declare class PostgresIntrospector implements DatabaseIntrospector {
#private;
constructor(db: Kysely<any>);
/**
* Get schema metadata.
*/
getSchemas(): Promise<SchemaMetadata[]>;
/**
* Get tables and views metadata.
*/
getTables(options?: DatabaseMetadataOptions): Promise<TableMetadata[]>;
/**
* Get the database metadata such as table and column names.
*
* @deprecated Use getTables() instead.
*/
getMetadata(options?: DatabaseMetadataOptions): Promise<DatabaseMetadata>;
}


@@ -0,0 +1,100 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.PostgresIntrospector = void 0;
const migrator_js_1 = require("../../migration/migrator.js");
const object_utils_js_1 = require("../../util/object-utils.js");
const sql_js_1 = require("../../raw-builder/sql.js");
class PostgresIntrospector {
#db;
constructor(db) {
this.#db = db;
}
async getSchemas() {
let rawSchemas = await this.#db
.selectFrom('pg_catalog.pg_namespace')
.select('nspname')
.$castTo()
.execute();
return rawSchemas.map((it) => ({ name: it.nspname }));
}
async getTables(options = { withInternalKyselyTables: false }) {
let query = this.#db
// column
.selectFrom('pg_catalog.pg_attribute as a')
// table
.innerJoin('pg_catalog.pg_class as c', 'a.attrelid', 'c.oid')
// table schema
.innerJoin('pg_catalog.pg_namespace as ns', 'c.relnamespace', 'ns.oid')
// column data type
.innerJoin('pg_catalog.pg_type as typ', 'a.atttypid', 'typ.oid')
// column data type schema
.innerJoin('pg_catalog.pg_namespace as dtns', 'typ.typnamespace', 'dtns.oid')
.select([
'a.attname as column',
'a.attnotnull as not_null',
'a.atthasdef as has_default',
'c.relname as table',
'c.relkind as table_type',
'ns.nspname as schema',
'typ.typname as type',
'dtns.nspname as type_schema',
(0, sql_js_1.sql) `col_description(a.attrelid, a.attnum)`.as('column_description'),
(0, sql_js_1.sql) `pg_get_serial_sequence(quote_ident(ns.nspname) || '.' || quote_ident(c.relname), a.attname)`.as('auto_incrementing'),
])
.where('c.relkind', 'in', [
'r' /*regular table*/,
'v' /*view*/,
'p' /*partitioned table*/,
])
.where('ns.nspname', '!~', '^pg_')
.where('ns.nspname', '!=', 'information_schema')
// Filter out internal cockroachdb schema
.where('ns.nspname', '!=', 'crdb_internal')
// Only schemas where we are allowed access
.where((0, sql_js_1.sql) `has_schema_privilege(ns.nspname, 'USAGE')`)
// No system columns
.where('a.attnum', '>=', 0)
.where('a.attisdropped', '!=', true)
.orderBy('ns.nspname')
.orderBy('c.relname')
.orderBy('a.attnum')
.$castTo();
if (!options.withInternalKyselyTables) {
query = query
.where('c.relname', '!=', migrator_js_1.DEFAULT_MIGRATION_TABLE)
.where('c.relname', '!=', migrator_js_1.DEFAULT_MIGRATION_LOCK_TABLE);
}
const rawColumns = await query.execute();
return this.#parseTableMetadata(rawColumns);
}
async getMetadata(options) {
return {
tables: await this.getTables(options),
};
}
#parseTableMetadata(columns) {
return columns.reduce((tables, it) => {
let table = tables.find((tbl) => tbl.name === it.table && tbl.schema === it.schema);
if (!table) {
table = (0, object_utils_js_1.freeze)({
name: it.table,
isView: it.table_type === 'v',
schema: it.schema,
columns: [],
});
tables.push(table);
}
table.columns.push((0, object_utils_js_1.freeze)({
name: it.column,
dataType: it.type,
dataTypeSchema: it.type_schema,
isNullable: !it.not_null,
isAutoIncrementing: it.auto_incrementing !== null,
hasDefaultValue: it.has_default,
comment: it.column_description ?? undefined,
}));
return tables;
}, []);
}
}
exports.PostgresIntrospector = PostgresIntrospector;
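The `#parseTableMetadata` reducer above folds the flat column result set into one entry per `(schema, table)` pair. A simplified, self-contained sketch of that grouping step — the `RawColumn`/`TableMeta` shapes and `groupColumns` name are illustrative, not Kysely types:

```typescript
interface RawColumn {
  table: string
  schema: string
  column: string
}

interface TableMeta {
  name: string
  schema: string
  columns: string[]
}

// Fold a flat list of column rows into one table entry per (schema, table),
// appending each column to the table it belongs to.
function groupColumns(columns: RawColumn[]): TableMeta[] {
  return columns.reduce<TableMeta[]>((tables, it) => {
    let table = tables.find((t) => t.name === it.table && t.schema === it.schema)
    if (!table) {
      table = { name: it.table, schema: it.schema, columns: [] }
      tables.push(table)
    }
    table.columns.push(it.column)
    return tables
  }, [])
}
```

Because the query orders by schema, table, and column position, the reducer preserves column order within each table.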


@@ -0,0 +1,4 @@
import { DefaultQueryCompiler } from '../../query-compiler/default-query-compiler.js';
export declare class PostgresQueryCompiler extends DefaultQueryCompiler {
protected sanitizeIdentifier(identifier: string): string;
}


@@ -0,0 +1,11 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.PostgresQueryCompiler = void 0;
const default_query_compiler_js_1 = require("../../query-compiler/default-query-compiler.js");
const ID_WRAP_REGEX = /"/g;
class PostgresQueryCompiler extends default_query_compiler_js_1.DefaultQueryCompiler {
sanitizeIdentifier(identifier) {
return identifier.replace(ID_WRAP_REGEX, '""');
}
}
exports.PostgresQueryCompiler = PostgresQueryCompiler;
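The compiler escapes embedded double quotes by doubling them, which makes any identifier safe to wrap in `"..."`. A standalone sketch of that escaping — the `wrapIdentifier` helper is illustrative; in Kysely the compiler applies the wrappers separately:

```typescript
const ID_WRAP_REGEX = /"/g

// Double every embedded double quote, per the SQL rule for quoted identifiers.
function sanitizeIdentifier(identifier: string): string {
  return identifier.replace(ID_WRAP_REGEX, '""')
}

// Wrap a sanitized identifier in double quotes, ready for interpolation
// into generated SQL.
function wrapIdentifier(identifier: string): string {
  return `"${sanitizeIdentifier(identifier)}"`
}
```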


@@ -0,0 +1,80 @@
import type { Kysely } from '../../kysely.js';
import { DialectAdapterBase } from '../dialect-adapter-base.js';
import type { MigrationLockOptions } from '../dialect-adapter.js';
export declare class SqliteAdapter extends DialectAdapterBase {
/**
* Whether or not this dialect supports transactional DDL.
*
* If this is true, migrations are executed inside a transaction.
*/
get supportsTransactionalDdl(): boolean;
/**
* Whether or not this dialect supports the `returning` clause in
* inserts, updates and deletes.
*/
get supportsReturning(): boolean;
/**
* This method is used to acquire a lock for the migrations so that
* it's not possible for two migration operations to run in parallel.
*
* Most dialects have explicit locks that can be used, like advisory locks
* in PostgreSQL and the get_lock function in MySQL.
*
* If the dialect doesn't have explicit locks the {@link MigrationLockOptions.lockTable}
* created by Kysely can be used instead. You can access it through the `options` object.
* The lock table has two columns `id` and `is_locked` and there's only one row in the table
* whose id is {@link MigrationLockOptions.lockRowId}. `is_locked` is an integer. Kysely
* takes care of creating the lock table and inserting the one single row to it before this
* method is executed. If the dialect supports schemas and the user has specified a custom
* schema in their migration settings, the options object also contains the schema name in
* {@link MigrationLockOptions.lockTableSchema}.
*
* Here's an example of how you might implement this method for a dialect that doesn't
* have explicit locks but supports `FOR UPDATE` row locks and transactional DDL:
*
* ```ts
* import { DialectAdapterBase, type MigrationLockOptions, Kysely } from 'kysely'
*
* export class MyAdapter extends DialectAdapterBase {
* override async acquireMigrationLock(
* db: Kysely<any>,
* options: MigrationLockOptions
* ): Promise<void> {
* const queryDb = options.lockTableSchema
* ? db.withSchema(options.lockTableSchema)
* : db
*
* // Since our imaginary dialect supports transactional DDL and has
* // row locks, we can simply take a row lock here and it will guarantee
* // all subsequent calls to this method from other transactions will
* // wait until this transaction finishes.
* await queryDb
* .selectFrom(options.lockTable)
* .selectAll()
* .where('id', '=', options.lockRowId)
* .forUpdate()
* .execute()
* }
*
* override async releaseMigrationLock() {
* // noop
* }
* }
* ```
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations will be executed. Otherwise
* `db` is a single connection (session) that will be used to execute the
* migrations.
*/
acquireMigrationLock(_db: Kysely<any>, _opt: MigrationLockOptions): Promise<void>;
/**
* Releases the migration lock. See {@link acquireMigrationLock}.
*
* If `supportsTransactionalDdl` is `true` then the `db` passed to this method
* is a transaction inside which the migrations were executed. Otherwise `db`
* is a single connection (session) that was used to execute the migrations
* and the `acquireMigrationLock` call.
*/
releaseMigrationLock(_db: Kysely<any>, _opt: MigrationLockOptions): Promise<void>;
}


@@ -0,0 +1,23 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.SqliteAdapter = void 0;
const dialect_adapter_base_js_1 = require("../dialect-adapter-base.js");
class SqliteAdapter extends dialect_adapter_base_js_1.DialectAdapterBase {
get supportsTransactionalDdl() {
return false;
}
get supportsReturning() {
return true;
}
async acquireMigrationLock(_db, _opt) {
// SQLite only has one connection that's reserved by the migration system
// for the whole time between acquireMigrationLock and releaseMigrationLock.
// We don't need to do anything here.
}
async releaseMigrationLock(_db, _opt) {
// SQLite only has one connection that's reserved by the migration system
// for the whole time between acquireMigrationLock and releaseMigrationLock.
// We don't need to do anything here.
}
}
exports.SqliteAdapter = SqliteAdapter;


@@ -0,0 +1,41 @@
import type { DatabaseConnection } from '../../driver/database-connection.js';
/**
* Config for the SQLite dialect.
*/
export interface SqliteDialectConfig {
/**
* An SQLite `Database` instance or a function that returns one.
*
* If a function is provided, it's called once when the first query is executed.
*
* https://github.com/JoshuaWise/better-sqlite3/blob/master/docs/api.md#new-databasepath-options
*/
database: SqliteDatabase | (() => Promise<SqliteDatabase>);
/**
* Called once when the first query is executed.
*
* This is a Kysely specific feature and does not come from the `better-sqlite3` module.
*/
onCreateConnection?: (connection: DatabaseConnection) => Promise<void>;
}
/**
* This interface is the subset of better-sqlite3 driver's `Database` class that
* kysely needs.
*
* We don't use the type from `better-sqlite3` here so that we don't have a dependency on it.
*
* https://github.com/JoshuaWise/better-sqlite3/blob/master/docs/api.md#new-databasepath-options
*/
export interface SqliteDatabase {
close(): void;
prepare(sql: string): SqliteStatement;
}
export interface SqliteStatement {
readonly reader: boolean;
all(parameters: ReadonlyArray<unknown>): unknown[];
run(parameters: ReadonlyArray<unknown>): {
changes: number | bigint;
lastInsertRowid: number | bigint;
};
iterate(parameters: ReadonlyArray<unknown>): IterableIterator<unknown>;
}


@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });


@@ -0,0 +1,55 @@
import type { Driver } from '../../driver/driver.js';
import type { Kysely } from '../../kysely.js';
import type { QueryCompiler } from '../../query-compiler/query-compiler.js';
import type { Dialect } from '../dialect.js';
import type { DatabaseIntrospector } from '../database-introspector.js';
import type { DialectAdapter } from '../dialect-adapter.js';
import type { SqliteDialectConfig } from './sqlite-dialect-config.js';
/**
* SQLite dialect that uses the [better-sqlite3](https://github.com/JoshuaWise/better-sqlite3) library.
*
* The constructor takes an instance of {@link SqliteDialectConfig}.
*
* ```ts
* import Database from 'better-sqlite3'
*
* new SqliteDialect({
* database: new Database('db.sqlite')
* })
* ```
*
* If you want the database to only be opened once it's first used, `database`
* can be a function:
*
* ```ts
* import Database from 'better-sqlite3'
*
* new SqliteDialect({
* database: async () => new Database('db.sqlite')
* })
* ```
*/
export declare class SqliteDialect implements Dialect {
#private;
constructor(config: SqliteDialectConfig);
/**
* Creates a driver for the dialect.
*/
createDriver(): Driver;
/**
* Creates a query compiler for the dialect.
*/
createQueryCompiler(): QueryCompiler;
/**
* Creates an adapter for the dialect.
*/
createAdapter(): DialectAdapter;
/**
* Creates a database introspector that can be used to get database metadata
* such as the table names and column names of those tables.
*
* `db` never has any plugins installed. It's created using
* {@link Kysely.withoutPlugins}.
*/
createIntrospector(db: Kysely<any>): DatabaseIntrospector;
}


@@ -0,0 +1,51 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.SqliteDialect = void 0;
const sqlite_driver_js_1 = require("./sqlite-driver.js");
const sqlite_query_compiler_js_1 = require("./sqlite-query-compiler.js");
const sqlite_introspector_js_1 = require("./sqlite-introspector.js");
const sqlite_adapter_js_1 = require("./sqlite-adapter.js");
const object_utils_js_1 = require("../../util/object-utils.js");
/**
* SQLite dialect that uses the [better-sqlite3](https://github.com/JoshuaWise/better-sqlite3) library.
*
* The constructor takes an instance of {@link SqliteDialectConfig}.
*
* ```ts
* import Database from 'better-sqlite3'
*
* new SqliteDialect({
* database: new Database('db.sqlite')
* })
* ```
*
* If you want the database to only be opened once it's first used, `database`
* can be a function:
*
* ```ts
* import Database from 'better-sqlite3'
*
* new SqliteDialect({
* database: async () => new Database('db.sqlite')
* })
* ```
*/
class SqliteDialect {
#config;
constructor(config) {
this.#config = (0, object_utils_js_1.freeze)({ ...config });
}
createDriver() {
return new sqlite_driver_js_1.SqliteDriver(this.#config);
}
createQueryCompiler() {
return new sqlite_query_compiler_js_1.SqliteQueryCompiler();
}
createAdapter() {
return new sqlite_adapter_js_1.SqliteAdapter();
}
createIntrospector(db) {
return new sqlite_introspector_js_1.SqliteIntrospector(db);
}
}
exports.SqliteDialect = SqliteDialect;


@@ -0,0 +1,42 @@
import type { DatabaseConnection } from '../../driver/database-connection.js';
import type { Driver } from '../../driver/driver.js';
import type { QueryCompiler } from '../../query-compiler/query-compiler.js';
import type { SqliteDialectConfig } from './sqlite-dialect-config.js';
export declare class SqliteDriver implements Driver {
#private;
constructor(config: SqliteDialectConfig);
/**
* Initializes the driver.
*
* After calling this method the driver should be usable and `acquireConnection` etc.
* methods should be callable.
*/
init(): Promise<void>;
/**
* Acquires a new connection from the pool.
*/
acquireConnection(): Promise<DatabaseConnection>;
/**
* Begins a transaction.
*/
beginTransaction(connection: DatabaseConnection): Promise<void>;
/**
* Commits a transaction.
*/
commitTransaction(connection: DatabaseConnection): Promise<void>;
/**
* Rolls back a transaction.
*/
rollbackTransaction(connection: DatabaseConnection): Promise<void>;
savepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
rollbackToSavepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
releaseSavepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
/**
* Releases a connection back to the pool.
*/
releaseConnection(): Promise<void>;
/**
* Destroys the driver and releases all resources.
*/
destroy(): Promise<void>;
}


@@ -0,0 +1,113 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.SqliteDriver = void 0;
const select_query_node_js_1 = require("../../operation-node/select-query-node.js");
const savepoint_parser_js_1 = require("../../parser/savepoint-parser.js");
const compiled_query_js_1 = require("../../query-compiler/compiled-query.js");
const object_utils_js_1 = require("../../util/object-utils.js");
const query_id_js_1 = require("../../util/query-id.js");
class SqliteDriver {
#config;
#connectionMutex = new ConnectionMutex();
#db;
#connection;
constructor(config) {
this.#config = (0, object_utils_js_1.freeze)({ ...config });
}
async init() {
this.#db = (0, object_utils_js_1.isFunction)(this.#config.database)
? await this.#config.database()
: this.#config.database;
this.#connection = new SqliteConnection(this.#db);
if (this.#config.onCreateConnection) {
await this.#config.onCreateConnection(this.#connection);
}
}
async acquireConnection() {
// SQLite only has one single connection. We use a mutex here to wait
// until the single connection has been released.
await this.#connectionMutex.lock();
return this.#connection;
}
async beginTransaction(connection) {
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('begin'));
}
async commitTransaction(connection) {
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('commit'));
}
async rollbackTransaction(connection) {
await connection.executeQuery(compiled_query_js_1.CompiledQuery.raw('rollback'));
}
async savepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('savepoint', savepointName), (0, query_id_js_1.createQueryId)()));
}
async rollbackToSavepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('rollback to', savepointName), (0, query_id_js_1.createQueryId)()));
}
async releaseSavepoint(connection, savepointName, compileQuery) {
await connection.executeQuery(compileQuery((0, savepoint_parser_js_1.parseSavepointCommand)('release', savepointName), (0, query_id_js_1.createQueryId)()));
}
async releaseConnection() {
this.#connectionMutex.unlock();
}
async destroy() {
this.#db?.close();
}
}
exports.SqliteDriver = SqliteDriver;
class SqliteConnection {
#db;
constructor(db) {
this.#db = db;
}
executeQuery(compiledQuery) {
const { sql, parameters } = compiledQuery;
const stmt = this.#db.prepare(sql);
if (stmt.reader) {
return Promise.resolve({
rows: stmt.all(parameters),
});
}
const { changes, lastInsertRowid } = stmt.run(parameters);
return Promise.resolve({
numAffectedRows: changes !== undefined && changes !== null ? BigInt(changes) : undefined,
insertId: lastInsertRowid !== undefined && lastInsertRowid !== null
? BigInt(lastInsertRowid)
: undefined,
rows: [],
});
}
async *streamQuery(compiledQuery, _chunkSize) {
const { sql, parameters, query } = compiledQuery;
const stmt = this.#db.prepare(sql);
if (select_query_node_js_1.SelectQueryNode.is(query)) {
const iter = stmt.iterate(parameters);
for (const row of iter) {
yield {
rows: [row],
};
}
}
else {
throw new Error('Sqlite driver only supports streaming of select queries');
}
}
}
class ConnectionMutex {
#promise;
#resolve;
async lock() {
while (this.#promise) {
await this.#promise;
}
this.#promise = new Promise((resolve) => {
this.#resolve = resolve;
});
}
unlock() {
const resolve = this.#resolve;
this.#promise = undefined;
this.#resolve = undefined;
resolve?.();
}
}
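The `ConnectionMutex` above hands the single SQLite connection to one caller at a time: `lock()` waits on the current holder's promise, and `unlock()` resolves it. A self-contained sketch of the same promise-chaining pattern — the `Mutex`, `demo`, and `task` names are illustrative, not Kysely exports:

```typescript
// A promise-based mutex: lock() waits until any current holder calls unlock().
class Mutex {
  #promise?: Promise<void>
  #resolve?: () => void

  async lock(): Promise<void> {
    // Loop, because several waiters can wake up when one holder releases
    // and only one of them should win the re-acquired lock.
    while (this.#promise) {
      await this.#promise
    }
    this.#promise = new Promise<void>((resolve) => {
      this.#resolve = resolve
    })
  }

  unlock(): void {
    const resolve = this.#resolve
    this.#promise = undefined
    this.#resolve = undefined
    resolve?.()
  }
}

// Two "queries" that would interleave without the mutex now run back to back.
async function demo(): Promise<string[]> {
  const mutex = new Mutex()
  const log: string[] = []

  async function task(name: string): Promise<void> {
    await mutex.lock()
    try {
      log.push(`${name}:start`)
      await new Promise((r) => setTimeout(r, 10))
      log.push(`${name}:end`)
    } finally {
      mutex.unlock()
    }
  }

  await Promise.all([task('a'), task('b')])
  return log
}
```

This is why `acquireConnection` in the SQLite driver simply awaits `lock()`: the second concurrent query blocks until the first one's connection is released.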


@@ -0,0 +1,20 @@
import type { DatabaseIntrospector, DatabaseMetadata, DatabaseMetadataOptions, SchemaMetadata, TableMetadata } from '../database-introspector.js';
import type { Kysely } from '../../kysely.js';
export declare class SqliteIntrospector implements DatabaseIntrospector {
#private;
constructor(db: Kysely<any>);
/**
* Get schema metadata.
*/
getSchemas(): Promise<SchemaMetadata[]>;
/**
* Get tables and views metadata.
*/
getTables(options?: DatabaseMetadataOptions): Promise<TableMetadata[]>;
/**
* Get the database metadata such as table and column names.
*
* @deprecated Use getTables() instead.
*/
getMetadata(options?: DatabaseMetadataOptions): Promise<DatabaseMetadata>;
}


@@ -0,0 +1,94 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.SqliteIntrospector = void 0;
const migrator_js_1 = require("../../migration/migrator.js");
const sql_js_1 = require("../../raw-builder/sql.js");
class SqliteIntrospector {
#db;
constructor(db) {
this.#db = db;
}
async getSchemas() {
// Sqlite doesn't support schemas.
return [];
}
async getTables(options = { withInternalKyselyTables: false }) {
return await this.#getTableMetadata(options);
}
async getMetadata(options) {
return {
tables: await this.getTables(options),
};
}
#tablesQuery(qb, options) {
let tablesQuery = qb
.selectFrom('sqlite_master')
.where('type', 'in', ['table', 'view'])
.where('name', 'not like', 'sqlite_%')
.select(['name', 'sql', 'type'])
.orderBy('name');
if (!options.withInternalKyselyTables) {
tablesQuery = tablesQuery
.where('name', '!=', migrator_js_1.DEFAULT_MIGRATION_TABLE)
.where('name', '!=', migrator_js_1.DEFAULT_MIGRATION_LOCK_TABLE);
}
return tablesQuery;
}
async #getTableMetadata(options) {
const tablesResult = await this.#tablesQuery(this.#db, options).execute();
const tableMetadata = await this.#db
.with('table_list', (qb) => this.#tablesQuery(qb, options))
.selectFrom([
'table_list as tl',
(0, sql_js_1.sql) `pragma_table_info(tl.name)`.as('p'),
])
.select([
'tl.name as table',
'p.cid',
'p.name',
'p.type',
'p.notnull',
'p.dflt_value',
'p.pk',
])
.orderBy('tl.name')
.orderBy('p.cid')
.execute();
const columnsByTable = {};
for (const row of tableMetadata) {
columnsByTable[row.table] ??= [];
columnsByTable[row.table].push(row);
}
return tablesResult.map(({ name, sql, type }) => {
// Try to find the name of the column that has `autoincrement`
let autoIncrementCol = sql
?.split(/[\(\),]/)
?.find((it) => it.toLowerCase().includes('autoincrement'))
?.trimStart()
?.split(/\s+/)?.[0]
?.replace(/["`]/g, '');
const columns = columnsByTable[name] ?? [];
// Otherwise, check for an INTEGER PRIMARY KEY
// https://www.sqlite.org/autoinc.html
if (!autoIncrementCol) {
const pkCols = columns.filter((r) => r.pk > 0);
if (pkCols.length === 1 && pkCols[0].type.toLowerCase() === 'integer') {
autoIncrementCol = pkCols[0].name;
}
}
return {
name: name,
isView: type === 'view',
columns: columns.map((col) => ({
name: col.name,
dataType: col.type,
isNullable: !col.notnull,
isAutoIncrementing: col.name === autoIncrementCol,
hasDefaultValue: col.dflt_value != null,
comment: undefined,
})),
};
});
}
}
exports.SqliteIntrospector = SqliteIntrospector;
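The introspector above recovers the `AUTOINCREMENT` column by splitting the stored `CREATE TABLE` statement on parentheses and commas, since `pragma_table_info` does not expose that flag. The same heuristic in isolation — the function name is illustrative:

```typescript
// Find the column declared AUTOINCREMENT inside a CREATE TABLE statement by
// splitting on parens/commas and taking the first word of the matching piece.
function findAutoIncrementColumn(createSql: string | undefined): string | undefined {
  return createSql
    ?.split(/[\(\),]/)
    ?.find((it) => it.toLowerCase().includes('autoincrement'))
    ?.trimStart()
    ?.split(/\s+/)?.[0]
    ?.replace(/["`]/g, '')
}
```

When this heuristic finds nothing, the introspector falls back to a lone `INTEGER PRIMARY KEY` column, which SQLite treats as an alias for the rowid.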


@@ -0,0 +1,14 @@
import type { DefaultInsertValueNode } from '../../operation-node/default-insert-value-node.js';
import type { OrActionNode } from '../../operation-node/or-action-node.js';
import { DefaultQueryCompiler } from '../../query-compiler/default-query-compiler.js';
export declare class SqliteQueryCompiler extends DefaultQueryCompiler {
protected visitOrAction(node: OrActionNode): void;
protected getCurrentParameterPlaceholder(): string;
protected getLeftExplainOptionsWrapper(): string;
protected getRightExplainOptionsWrapper(): string;
protected getLeftIdentifierWrapper(): string;
protected getRightIdentifierWrapper(): string;
protected getAutoIncrement(): string;
protected sanitizeIdentifier(identifier: string): string;
protected visitDefaultInsertValue(_: DefaultInsertValueNode): void;
}


@@ -0,0 +1,37 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.SqliteQueryCompiler = void 0;
const default_query_compiler_js_1 = require("../../query-compiler/default-query-compiler.js");
const ID_WRAP_REGEX = /"/g;
class SqliteQueryCompiler extends default_query_compiler_js_1.DefaultQueryCompiler {
visitOrAction(node) {
this.append('or ');
this.append(node.action);
}
getCurrentParameterPlaceholder() {
return '?';
}
getLeftExplainOptionsWrapper() {
return '';
}
getRightExplainOptionsWrapper() {
return '';
}
getLeftIdentifierWrapper() {
return '"';
}
getRightIdentifierWrapper() {
return '"';
}
getAutoIncrement() {
return 'autoincrement';
}
sanitizeIdentifier(identifier) {
return identifier.replace(ID_WRAP_REGEX, '""');
}
visitDefaultInsertValue(_) {
// sqlite doesn't support the `default` keyword in inserts.
this.append('null');
}
}
exports.SqliteQueryCompiler = SqliteQueryCompiler;


@@ -0,0 +1,8 @@
import type { DatabaseConnection } from './database-connection.js';
export interface ConnectionProvider {
/**
* Provides a connection for the callback and takes care of disposing
* the connection after the callback has been run.
*/
provideConnection<T>(consumer: (connection: DatabaseConnection) => Promise<T>): Promise<T>;
}


@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });


@@ -0,0 +1,35 @@
import type { CompiledQuery } from '../query-compiler/compiled-query.js';
/**
* A single connection to the database engine.
*
* These are created by an instance of {@link Driver}.
*/
export interface DatabaseConnection {
executeQuery<R>(compiledQuery: CompiledQuery): Promise<QueryResult<R>>;
streamQuery<R>(compiledQuery: CompiledQuery, chunkSize?: number): AsyncIterableIterator<QueryResult<R>>;
}
export interface QueryResult<O> {
/**
* This is defined for insert, update, delete and merge queries and contains
* the number of rows the query inserted/updated/deleted.
*/
readonly numAffectedRows?: bigint;
/**
* This is defined for update queries and contains the number of rows
* the query changed.
*
* This is **optional** and only provided in dialects such as MySQL.
* You would probably use {@link numAffectedRows} in most cases.
*/
readonly numChangedRows?: bigint;
/**
* This is defined for insert queries on dialects that return
* the auto incrementing primary key from an insert.
*/
readonly insertId?: bigint;
/**
* The rows returned by the query. This is always defined and is
* empty if the query returned no rows.
*/
readonly rows: O[];
}


@@ -0,0 +1,2 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });


@@ -0,0 +1,12 @@
import type { DatabaseConnection } from './database-connection.js';
import type { ConnectionProvider } from './connection-provider.js';
import type { Driver } from './driver.js';
export declare class DefaultConnectionProvider implements ConnectionProvider {
#private;
constructor(driver: Driver);
/**
* Provides a connection for the callback and takes care of disposing
* the connection after the callback has been run.
*/
provideConnection<T>(consumer: (connection: DatabaseConnection) => Promise<T>): Promise<T>;
}


@@ -0,0 +1,19 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.DefaultConnectionProvider = void 0;
class DefaultConnectionProvider {
#driver;
constructor(driver) {
this.#driver = driver;
}
async provideConnection(consumer) {
const connection = await this.#driver.acquireConnection();
try {
return await consumer(connection);
}
finally {
await this.#driver.releaseConnection(connection);
}
}
}
exports.DefaultConnectionProvider = DefaultConnectionProvider;
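`DefaultConnectionProvider.provideConnection` acquires a connection, hands it to the consumer, and releases it in a `finally` block so the connection goes back to the pool even when the consumer throws. A self-contained sketch of that pattern — the `FakeDriver` and `FakeConnection` types are stand-ins, not Kysely types:

```typescript
interface FakeConnection {
  id: number
}

// A toy driver that records every released connection id.
class FakeDriver {
  released: number[] = []
  #next = 0

  async acquireConnection(): Promise<FakeConnection> {
    return { id: this.#next++ }
  }

  async releaseConnection(conn: FakeConnection): Promise<void> {
    this.released.push(conn.id)
  }
}

// Acquire, run the consumer, and always release — success or failure.
async function provideConnection<T>(
  driver: FakeDriver,
  consumer: (connection: FakeConnection) => Promise<T>,
): Promise<T> {
  const connection = await driver.acquireConnection()
  try {
    return await consumer(connection)
  } finally {
    await driver.releaseConnection(connection)
  }
}
```

The `return await` (rather than a bare `return`) matters: it keeps the consumer's rejection inside the `try`, so the `finally` release runs before the error propagates.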

61
node_modules/kysely/dist/cjs/driver/driver.d.ts generated vendored Normal file

@@ -0,0 +1,61 @@
import type { QueryCompiler } from '../query-compiler/query-compiler.js';
import type { ArrayItemType } from '../util/type-utils.js';
import type { DatabaseConnection } from './database-connection.js';
/**
* A Driver creates and releases {@link DatabaseConnection | database connections}
* and is also responsible for connection pooling (if the dialect supports pooling).
*/
export interface Driver {
/**
* Initializes the driver.
*
* After calling this method the driver should be usable and `acquireConnection` etc.
* methods should be callable.
*/
init(): Promise<void>;
/**
* Acquires a new connection from the pool.
*/
acquireConnection(): Promise<DatabaseConnection>;
/**
* Begins a transaction.
*/
beginTransaction(connection: DatabaseConnection, settings: TransactionSettings): Promise<void>;
/**
* Commits a transaction.
*/
commitTransaction(connection: DatabaseConnection): Promise<void>;
/**
* Rolls back a transaction.
*/
rollbackTransaction(connection: DatabaseConnection): Promise<void>;
/**
* Establishes a new savepoint within a transaction.
*/
savepoint?(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
/**
* Rolls back to a savepoint within a transaction.
*/
rollbackToSavepoint?(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
/**
* Releases a savepoint within a transaction.
*/
releaseSavepoint?(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
/**
* Releases a connection back to the pool.
*/
releaseConnection(connection: DatabaseConnection): Promise<void>;
/**
* Destroys the driver and releases all resources.
*/
destroy(): Promise<void>;
}
export interface TransactionSettings {
readonly accessMode?: AccessMode;
readonly isolationLevel?: IsolationLevel;
}
export declare const TRANSACTION_ACCESS_MODES: readonly ["read only", "read write"];
export type AccessMode = ArrayItemType<typeof TRANSACTION_ACCESS_MODES>;
export declare const TRANSACTION_ISOLATION_LEVELS: readonly ["read uncommitted", "read committed", "repeatable read", "serializable", "snapshot"];
export type IsolationLevel = ArrayItemType<typeof TRANSACTION_ISOLATION_LEVELS>;
export declare function validateTransactionSettings(settings: TransactionSettings): void;

22
node_modules/kysely/dist/cjs/driver/driver.js generated vendored Normal file

@@ -0,0 +1,22 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.TRANSACTION_ISOLATION_LEVELS = exports.TRANSACTION_ACCESS_MODES = void 0;
exports.validateTransactionSettings = validateTransactionSettings;
exports.TRANSACTION_ACCESS_MODES = ['read only', 'read write'];
exports.TRANSACTION_ISOLATION_LEVELS = [
'read uncommitted',
'read committed',
'repeatable read',
'serializable',
'snapshot',
];
function validateTransactionSettings(settings) {
if (settings.accessMode &&
!exports.TRANSACTION_ACCESS_MODES.includes(settings.accessMode)) {
throw new Error(`invalid transaction access mode ${settings.accessMode}`);
}
if (settings.isolationLevel &&
!exports.TRANSACTION_ISOLATION_LEVELS.includes(settings.isolationLevel)) {
throw new Error(`invalid transaction isolation level ${settings.isolationLevel}`);
}
}
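`validateTransactionSettings` above checks each setting against the corresponding readonly array. The same membership check can be written as a reusable type guard — `isValidAccessMode` is a sketch, not a Kysely export:

```typescript
// Mirror of Kysely's TRANSACTION_ACCESS_MODES; the union type is derived
// from the array, so adding a mode updates both in one place.
const TRANSACTION_ACCESS_MODES = ['read only', 'read write'] as const
type AccessMode = (typeof TRANSACTION_ACCESS_MODES)[number]

// Narrow an arbitrary string to the AccessMode union via an includes() check.
function isValidAccessMode(mode: string): mode is AccessMode {
  return (TRANSACTION_ACCESS_MODES as readonly string[]).includes(mode)
}
```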

75
node_modules/kysely/dist/cjs/driver/dummy-driver.d.ts generated vendored Normal file

@@ -0,0 +1,75 @@
import type { DatabaseConnection } from './database-connection.js';
import type { Driver } from './driver.js';
/**
* A driver that does absolutely nothing.
*
* You can use this to create Kysely instances solely for building queries.
*
* ### Examples
*
* This example creates a Kysely instance for building postgres queries:
*
* ```ts
* import {
* DummyDriver,
* Kysely,
* PostgresAdapter,
* PostgresIntrospector,
* PostgresQueryCompiler
* } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: {
* createAdapter: () => new PostgresAdapter(),
* createDriver: () => new DummyDriver(),
* createIntrospector: (db: Kysely<any>) => new PostgresIntrospector(db),
* createQueryCompiler: () => new PostgresQueryCompiler(),
* },
* })
* ```
*
* You can use it to build a query and compile it to SQL but trying to
* execute the query will throw an error.
*
* ```ts
* const { sql } = db.selectFrom('person').selectAll().compile()
* console.log(sql) // select * from "person"
* ```
*/
export declare class DummyDriver implements Driver {
/**
* Initializes the driver.
*
* After calling this method the driver should be usable and `acquireConnection` etc.
* methods should be callable.
*/
init(): Promise<void>;
/**
* Acquires a new connection from the pool.
*/
acquireConnection(): Promise<DatabaseConnection>;
/**
* Begins a transaction.
*/
beginTransaction(): Promise<void>;
/**
* Commits a transaction.
*/
commitTransaction(): Promise<void>;
/**
* Rolls back a transaction.
*/
rollbackTransaction(): Promise<void>;
/**
* Releases a connection back to the pool.
*/
releaseConnection(): Promise<void>;
/**
* Destroys the driver and releases all resources.
*/
destroy(): Promise<void>;
releaseSavepoint(): Promise<void>;
rollbackToSavepoint(): Promise<void>;
savepoint(): Promise<void>;
}

83
node_modules/kysely/dist/cjs/driver/dummy-driver.js generated vendored Normal file

@@ -0,0 +1,83 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.DummyDriver = void 0;
/**
* A driver that does absolutely nothing.
*
* You can use this to create Kysely instances solely for building queries.
*
* ### Examples
*
* This example creates a Kysely instance for building postgres queries:
*
* ```ts
* import {
* DummyDriver,
* Kysely,
* PostgresAdapter,
* PostgresIntrospector,
* PostgresQueryCompiler
* } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: {
* createAdapter: () => new PostgresAdapter(),
* createDriver: () => new DummyDriver(),
* createIntrospector: (db: Kysely<any>) => new PostgresIntrospector(db),
* createQueryCompiler: () => new PostgresQueryCompiler(),
* },
* })
* ```
*
* You can use it to build a query and compile it to SQL but trying to
* execute the query will throw an error.
*
* ```ts
* const { sql } = db.selectFrom('person').selectAll().compile()
* console.log(sql) // select * from "person"
* ```
*/
class DummyDriver {
async init() {
// Nothing to do here.
}
async acquireConnection() {
return new DummyConnection();
}
async beginTransaction() {
// Nothing to do here.
}
async commitTransaction() {
// Nothing to do here.
}
async rollbackTransaction() {
// Nothing to do here.
}
async releaseConnection() {
// Nothing to do here.
}
async destroy() {
// Nothing to do here.
}
async releaseSavepoint() {
// Nothing to do here.
}
async rollbackToSavepoint() {
// Nothing to do here.
}
async savepoint() {
// Nothing to do here.
}
}
exports.DummyDriver = DummyDriver;
class DummyConnection {
async executeQuery() {
return {
rows: [],
};
}
async *streamQuery() {
// Nothing to do here.
}
}
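`DummyConnection` above simply returns an empty result set and streams nothing, which is what lets a `DummyDriver`-backed Kysely instance compile queries without ever touching a database. As a standalone sketch of that no-op driver/connection pattern (plain JavaScript with illustrative names, not kysely's actual interfaces):

```javascript
// No-op driver sketch: every lifecycle method resolves immediately, and
// the connection it hands out returns empty row sets. `NoopDriver` and
// `NoopConnection` are illustrative names, not part of kysely.
class NoopConnection {
  async executeQuery() {
    // No database is contacted; callers always get an empty row set.
    return { rows: [] };
  }
  async *streamQuery() {
    // Yields nothing: streaming against a no-op connection produces no chunks.
  }
}

class NoopDriver {
  async init() {}
  async acquireConnection() {
    return new NoopConnection();
  }
  async releaseConnection() {}
  async destroy() {}
}
```

Any code path that only builds and compiles queries can run against such a driver; only an attempt to read real rows reveals that nothing is behind it.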


@@ -0,0 +1,47 @@
import type { QueryCompiler } from '../query-compiler/query-compiler.js';
import type { Log } from '../util/log.js';
import type { DatabaseConnection } from './database-connection.js';
import type { Driver, TransactionSettings } from './driver.js';
/**
 * A small wrapper around {@link Driver} that makes sure the driver is
 * initialized before it is used, and that it is initialized and
 * destroyed only once.
*/
export declare class RuntimeDriver implements Driver {
#private;
constructor(driver: Driver, log: Log);
/**
* Initializes the driver.
*
     * After calling this method, the driver should be usable, and methods like
     * `acquireConnection` should be callable.
*/
init(): Promise<void>;
/**
* Acquires a new connection from the pool.
*/
acquireConnection(): Promise<DatabaseConnection>;
/**
* Releases a connection back to the pool.
*/
releaseConnection(connection: DatabaseConnection): Promise<void>;
/**
* Begins a transaction.
*/
beginTransaction(connection: DatabaseConnection, settings: TransactionSettings): Promise<void>;
/**
* Commits a transaction.
*/
commitTransaction(connection: DatabaseConnection): Promise<void>;
/**
* Rolls back a transaction.
*/
rollbackTransaction(connection: DatabaseConnection): Promise<void>;
savepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
rollbackToSavepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
releaseSavepoint(connection: DatabaseConnection, savepointName: string, compileQuery: QueryCompiler['compileQuery']): Promise<void>;
/**
* Destroys the driver and releases all resources.
*/
destroy(): Promise<void>;
}

node_modules/kysely/dist/cjs/driver/runtime-driver.js generated vendored Normal file

@@ -0,0 +1,165 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.RuntimeDriver = void 0;
const performance_now_js_1 = require("../util/performance-now.js");
/**
 * A small wrapper around {@link Driver} that makes sure the driver is
 * initialized before it is used, and that it is initialized and
 * destroyed only once.
*/
class RuntimeDriver {
#driver;
#log;
#initPromise;
#initDone;
#destroyPromise;
#connections = new WeakSet();
constructor(driver, log) {
this.#initDone = false;
this.#driver = driver;
this.#log = log;
}
async init() {
if (this.#destroyPromise) {
throw new Error('driver has already been destroyed');
}
if (!this.#initPromise) {
this.#initPromise = this.#driver
.init()
.then(() => {
this.#initDone = true;
})
.catch((err) => {
this.#initPromise = undefined;
return Promise.reject(err);
});
}
await this.#initPromise;
}
async acquireConnection() {
if (this.#destroyPromise) {
throw new Error('driver has already been destroyed');
}
if (!this.#initDone) {
await this.init();
}
const connection = await this.#driver.acquireConnection();
if (!this.#connections.has(connection)) {
if (this.#needsLogging()) {
this.#addLogging(connection);
}
this.#connections.add(connection);
}
return connection;
}
async releaseConnection(connection) {
await this.#driver.releaseConnection(connection);
}
beginTransaction(connection, settings) {
return this.#driver.beginTransaction(connection, settings);
}
commitTransaction(connection) {
return this.#driver.commitTransaction(connection);
}
rollbackTransaction(connection) {
return this.#driver.rollbackTransaction(connection);
}
savepoint(connection, savepointName, compileQuery) {
if (this.#driver.savepoint) {
return this.#driver.savepoint(connection, savepointName, compileQuery);
}
throw new Error('The `savepoint` method is not supported by this driver');
}
rollbackToSavepoint(connection, savepointName, compileQuery) {
if (this.#driver.rollbackToSavepoint) {
return this.#driver.rollbackToSavepoint(connection, savepointName, compileQuery);
}
throw new Error('The `rollbackToSavepoint` method is not supported by this driver');
}
releaseSavepoint(connection, savepointName, compileQuery) {
if (this.#driver.releaseSavepoint) {
return this.#driver.releaseSavepoint(connection, savepointName, compileQuery);
}
throw new Error('The `releaseSavepoint` method is not supported by this driver');
}
async destroy() {
if (!this.#initPromise) {
return;
}
await this.#initPromise;
if (!this.#destroyPromise) {
this.#destroyPromise = this.#driver.destroy().catch((err) => {
this.#destroyPromise = undefined;
return Promise.reject(err);
});
}
await this.#destroyPromise;
}
#needsLogging() {
return (this.#log.isLevelEnabled('query') || this.#log.isLevelEnabled('error'));
}
// This method monkey patches the database connection's executeQuery method
// by adding logging code around it. Monkey patching is not pretty, but it's
// the best option in this case.
#addLogging(connection) {
const executeQuery = connection.executeQuery;
const streamQuery = connection.streamQuery;
const dis = this;
connection.executeQuery = async (compiledQuery) => {
let caughtError;
const startTime = (0, performance_now_js_1.performanceNow)();
try {
return await executeQuery.call(connection, compiledQuery);
}
catch (error) {
caughtError = error;
await dis.#logError(error, compiledQuery, startTime);
throw error;
}
finally {
if (!caughtError) {
await dis.#logQuery(compiledQuery, startTime);
}
}
};
connection.streamQuery = async function* (compiledQuery, chunkSize) {
let caughtError;
const startTime = (0, performance_now_js_1.performanceNow)();
try {
for await (const result of streamQuery.call(connection, compiledQuery, chunkSize)) {
yield result;
}
}
catch (error) {
caughtError = error;
await dis.#logError(error, compiledQuery, startTime);
throw error;
}
finally {
if (!caughtError) {
await dis.#logQuery(compiledQuery, startTime, true);
}
}
};
}
async #logError(error, compiledQuery, startTime) {
await this.#log.error(() => ({
level: 'error',
error,
query: compiledQuery,
queryDurationMillis: this.#calculateDurationMillis(startTime),
}));
}
async #logQuery(compiledQuery, startTime, isStream = false) {
await this.#log.query(() => ({
level: 'query',
isStream,
query: compiledQuery,
queryDurationMillis: this.#calculateDurationMillis(startTime),
}));
}
#calculateDurationMillis(startTime) {
return (0, performance_now_js_1.performanceNow)() - startTime;
}
}
exports.RuntimeDriver = RuntimeDriver;


@@ -0,0 +1,11 @@
import type { DatabaseConnection } from './database-connection.js';
import type { ConnectionProvider } from './connection-provider.js';
export declare class SingleConnectionProvider implements ConnectionProvider {
#private;
constructor(connection: DatabaseConnection);
/**
* Provides a connection for the callback and takes care of disposing
* the connection after the callback has been run.
*/
provideConnection<T>(consumer: (connection: DatabaseConnection) => Promise<T>): Promise<T>;
}


@@ -0,0 +1,29 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.SingleConnectionProvider = void 0;
const ignoreError = () => { };
class SingleConnectionProvider {
#connection;
#runningPromise;
constructor(connection) {
this.#connection = connection;
}
async provideConnection(consumer) {
while (this.#runningPromise) {
await this.#runningPromise.catch(ignoreError);
}
// `#runningPromise` must be set to undefined before it's
// resolved or rejected. Otherwise the while loop above
// will misbehave.
this.#runningPromise = this.#run(consumer).finally(() => {
this.#runningPromise = undefined;
});
return this.#runningPromise;
}
// Run the runner in an async function to make sure it doesn't
// throw synchronous errors.
async #run(runner) {
return await runner(this.#connection);
}
}
exports.SingleConnectionProvider = SingleConnectionProvider;
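`SingleConnectionProvider` serializes concurrent callers onto its one connection: each caller waits out the in-flight promise in a loop (swallowing its error, since a previous caller's failure is not this caller's failure), and `finally()` clears the slot before the stored promise settles, so waiters observe a fresh state when they resume. A standalone sketch of that queueing idiom (class and method names are illustrative):

```javascript
// Serializer sketch: at most one task runs at a time; later calls wait for
// the in-flight one, even if it rejected. Mirrors the provider's
// "clear before settle" trick via finally().
const swallow = () => {};

class Serializer {
  #running;

  async run(task) {
    // Wait out any in-flight task; its failure is not our failure.
    while (this.#running) {
      await this.#running.catch(swallow);
    }
    // finally() clears #running before callers awaiting it resume, so the
    // while loop above sees an empty slot on wake-up.
    this.#running = this.#invoke(task).finally(() => {
      this.#running = undefined;
    });
    return this.#running;
  }

  // The async indirection turns synchronous throws into rejections,
  // like the provider's #run method.
  async #invoke(task) {
    return await task();
  }
}
```

Two `run` calls issued back to back therefore never interleave: the second one's task starts only after the first has settled.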


@@ -0,0 +1,18 @@
import { type OperationNodeSource } from '../operation-node/operation-node-source.js';
import type { SimpleReferenceExpressionNode } from '../operation-node/simple-reference-expression-node.js';
export declare class DynamicReferenceBuilder<R extends string = never> implements OperationNodeSource {
#private;
get dynamicReference(): string;
/**
* @private
*
* This needs to be here just so that the typings work. Without this
* the generated .d.ts file contains no reference to the type param R
* which causes this type to be equal to DynamicReferenceBuilder with
* any R.
*/
protected get refType(): R;
constructor(reference: string);
toOperationNode(): SimpleReferenceExpressionNode;
}
export declare function isDynamicReferenceBuilder(obj: unknown): obj is DynamicReferenceBuilder<any>;


@@ -0,0 +1,36 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.DynamicReferenceBuilder = void 0;
exports.isDynamicReferenceBuilder = isDynamicReferenceBuilder;
const operation_node_source_js_1 = require("../operation-node/operation-node-source.js");
const reference_parser_js_1 = require("../parser/reference-parser.js");
const object_utils_js_1 = require("../util/object-utils.js");
class DynamicReferenceBuilder {
#dynamicReference;
get dynamicReference() {
return this.#dynamicReference;
}
/**
* @private
*
* This needs to be here just so that the typings work. Without this
* the generated .d.ts file contains no reference to the type param R
* which causes this type to be equal to DynamicReferenceBuilder with
* any R.
*/
get refType() {
return undefined;
}
constructor(reference) {
this.#dynamicReference = reference;
}
toOperationNode() {
return (0, reference_parser_js_1.parseSimpleReferenceExpression)(this.#dynamicReference);
}
}
exports.DynamicReferenceBuilder = DynamicReferenceBuilder;
function isDynamicReferenceBuilder(obj) {
return ((0, object_utils_js_1.isObject)(obj) &&
(0, operation_node_source_js_1.isOperationNodeSource)(obj) &&
(0, object_utils_js_1.isString)(obj.dynamicReference));
}


@@ -0,0 +1,16 @@
import { AliasNode } from '../operation-node/alias-node.js';
import { type OperationNodeSource } from '../operation-node/operation-node-source.js';
export declare class DynamicTableBuilder<T extends string> {
#private;
get table(): T;
constructor(table: T);
as<A extends string>(alias: A): AliasedDynamicTableBuilder<T, A>;
}
export declare class AliasedDynamicTableBuilder<T extends string, A extends string> implements OperationNodeSource {
#private;
get table(): T;
get alias(): A;
constructor(table: T, alias: A);
toOperationNode(): AliasNode;
}
export declare function isAliasedDynamicTableBuilder(obj: unknown): obj is AliasedDynamicTableBuilder<any, any>;


@@ -0,0 +1,46 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.AliasedDynamicTableBuilder = exports.DynamicTableBuilder = void 0;
exports.isAliasedDynamicTableBuilder = isAliasedDynamicTableBuilder;
const alias_node_js_1 = require("../operation-node/alias-node.js");
const identifier_node_js_1 = require("../operation-node/identifier-node.js");
const operation_node_source_js_1 = require("../operation-node/operation-node-source.js");
const table_parser_js_1 = require("../parser/table-parser.js");
const object_utils_js_1 = require("../util/object-utils.js");
class DynamicTableBuilder {
#table;
get table() {
return this.#table;
}
constructor(table) {
this.#table = table;
}
as(alias) {
return new AliasedDynamicTableBuilder(this.#table, alias);
}
}
exports.DynamicTableBuilder = DynamicTableBuilder;
class AliasedDynamicTableBuilder {
#table;
#alias;
get table() {
return this.#table;
}
get alias() {
return this.#alias;
}
constructor(table, alias) {
this.#table = table;
this.#alias = alias;
}
toOperationNode() {
return alias_node_js_1.AliasNode.create((0, table_parser_js_1.parseTable)(this.#table), identifier_node_js_1.IdentifierNode.create(this.#alias));
}
}
exports.AliasedDynamicTableBuilder = AliasedDynamicTableBuilder;
function isAliasedDynamicTableBuilder(obj) {
return ((0, object_utils_js_1.isObject)(obj) &&
(0, operation_node_source_js_1.isOperationNodeSource)(obj) &&
(0, object_utils_js_1.isString)(obj.table) &&
(0, object_utils_js_1.isString)(obj.alias));
}

node_modules/kysely/dist/cjs/dynamic/dynamic.d.ts generated vendored Normal file

@@ -0,0 +1,124 @@
import { DynamicReferenceBuilder } from './dynamic-reference-builder.js';
import { DynamicTableBuilder } from './dynamic-table-builder.js';
export declare class DynamicModule<DB> {
/**
     * Creates a dynamic reference to a column that is not known at compile time.
*
* Kysely is built in a way that by default you can't refer to tables or columns
* that are not actually visible in the current query and context. This is all
* done by TypeScript at compile time, which means that you need to know the
* columns and tables at compile time. This is not always the case of course.
*
* This method is meant to be used in those cases where the column names
* come from the user input or are not otherwise known at compile time.
*
* WARNING! Unlike values, column names are not escaped by the database engine
* or Kysely and if you pass in unchecked column names using this method, you
* create an SQL injection vulnerability. Always __always__ validate the user
* input before passing it to this method.
*
     * There are a couple of examples below for some use cases, but you can pass
* `ref` to other methods as well. If the types allow you to pass a `ref`
* value to some place, it should work.
*
* ### Examples
*
     * Filter by a column not known at compile time:
*
* ```ts
* async function someQuery(filterColumn: string, filterValue: string) {
* const { ref } = db.dynamic
*
* return await db
* .selectFrom('person')
* .selectAll()
* .where(ref(filterColumn), '=', filterValue)
* .execute()
* }
*
* someQuery('first_name', 'Arnold')
* someQuery('person.last_name', 'Aniston')
* ```
*
     * Order by a column not known at compile time:
*
* ```ts
* async function someQuery(orderBy: string) {
* const { ref } = db.dynamic
*
* return await db
* .selectFrom('person')
* .select('person.first_name as fn')
* .orderBy(ref(orderBy))
* .execute()
* }
*
* someQuery('fn')
* ```
*
* In this example we add selections dynamically:
*
* ```ts
* const { ref } = db.dynamic
*
* // Some column name provided by the user. Value not known at compile time.
* const columnFromUserInput: PossibleColumns = 'birthdate';
*
* // A type that lists all possible values `columnFromUserInput` can have.
* // You can use `keyof Person` if any column of an interface is allowed.
* type PossibleColumns = 'last_name' | 'first_name' | 'birthdate'
*
* const [person] = await db.selectFrom('person')
* .select([
* ref<PossibleColumns>(columnFromUserInput),
* 'id'
* ])
* .execute()
*
* // The resulting type contains all `PossibleColumns` as optional fields
* // because we cannot know which field was actually selected before
* // running the code.
* const lastName: string | null | undefined = person?.last_name
* const firstName: string | undefined = person?.first_name
* const birthDate: Date | null | undefined = person?.birthdate
*
* // The result type also contains the compile time selection `id`.
* person?.id
* ```
*/
ref<R extends string = never>(reference: string): DynamicReferenceBuilder<R>;
/**
* Creates a table reference to a table that's not fully known at compile time.
*
* The type `T` is allowed to be a union of multiple tables.
*
* <!-- siteExample("select", "Generic find query", 130) -->
*
* A generic type-safe helper function for finding a row by a column value:
*
* ```ts
* import { SelectType } from 'kysely'
* import { Database } from 'type-editor'
*
* async function getRowByColumn<
* T extends keyof Database,
* C extends keyof Database[T] & string,
* V extends SelectType<Database[T][C]>,
* >(t: T, c: C, v: V) {
* // We need to use the dynamic module since the table name
* // is not known at compile time.
* const { table, ref } = db.dynamic
*
* return await db
* .selectFrom(table(t).as('t'))
* .selectAll()
* .where(ref(c), '=', v)
* .orderBy('t.id')
* .executeTakeFirstOrThrow()
* }
*
* const person = await getRowByColumn('person', 'first_name', 'Arnold')
* ```
*/
table<T extends keyof DB & string>(table: T): DynamicTableBuilder<T>;
}

node_modules/kysely/dist/cjs/dynamic/dynamic.js generated vendored Normal file

@@ -0,0 +1,132 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.DynamicModule = void 0;
const dynamic_reference_builder_js_1 = require("./dynamic-reference-builder.js");
const dynamic_table_builder_js_1 = require("./dynamic-table-builder.js");
class DynamicModule {
/**
     * Creates a dynamic reference to a column that is not known at compile time.
*
* Kysely is built in a way that by default you can't refer to tables or columns
* that are not actually visible in the current query and context. This is all
* done by TypeScript at compile time, which means that you need to know the
* columns and tables at compile time. This is not always the case of course.
*
* This method is meant to be used in those cases where the column names
* come from the user input or are not otherwise known at compile time.
*
* WARNING! Unlike values, column names are not escaped by the database engine
* or Kysely and if you pass in unchecked column names using this method, you
* create an SQL injection vulnerability. Always __always__ validate the user
* input before passing it to this method.
*
     * There are a couple of examples below for some use cases, but you can pass
* `ref` to other methods as well. If the types allow you to pass a `ref`
* value to some place, it should work.
*
* ### Examples
*
     * Filter by a column not known at compile time:
*
* ```ts
* async function someQuery(filterColumn: string, filterValue: string) {
* const { ref } = db.dynamic
*
* return await db
* .selectFrom('person')
* .selectAll()
* .where(ref(filterColumn), '=', filterValue)
* .execute()
* }
*
* someQuery('first_name', 'Arnold')
* someQuery('person.last_name', 'Aniston')
* ```
*
     * Order by a column not known at compile time:
*
* ```ts
* async function someQuery(orderBy: string) {
* const { ref } = db.dynamic
*
* return await db
* .selectFrom('person')
* .select('person.first_name as fn')
* .orderBy(ref(orderBy))
* .execute()
* }
*
* someQuery('fn')
* ```
*
* In this example we add selections dynamically:
*
* ```ts
* const { ref } = db.dynamic
*
* // Some column name provided by the user. Value not known at compile time.
* const columnFromUserInput: PossibleColumns = 'birthdate';
*
* // A type that lists all possible values `columnFromUserInput` can have.
* // You can use `keyof Person` if any column of an interface is allowed.
* type PossibleColumns = 'last_name' | 'first_name' | 'birthdate'
*
* const [person] = await db.selectFrom('person')
* .select([
* ref<PossibleColumns>(columnFromUserInput),
* 'id'
* ])
* .execute()
*
* // The resulting type contains all `PossibleColumns` as optional fields
* // because we cannot know which field was actually selected before
* // running the code.
* const lastName: string | null | undefined = person?.last_name
* const firstName: string | undefined = person?.first_name
* const birthDate: Date | null | undefined = person?.birthdate
*
* // The result type also contains the compile time selection `id`.
* person?.id
* ```
*/
ref(reference) {
return new dynamic_reference_builder_js_1.DynamicReferenceBuilder(reference);
}
/**
* Creates a table reference to a table that's not fully known at compile time.
*
* The type `T` is allowed to be a union of multiple tables.
*
* <!-- siteExample("select", "Generic find query", 130) -->
*
* A generic type-safe helper function for finding a row by a column value:
*
* ```ts
* import { SelectType } from 'kysely'
* import { Database } from 'type-editor'
*
* async function getRowByColumn<
* T extends keyof Database,
* C extends keyof Database[T] & string,
* V extends SelectType<Database[T][C]>,
* >(t: T, c: C, v: V) {
* // We need to use the dynamic module since the table name
* // is not known at compile time.
* const { table, ref } = db.dynamic
*
* return await db
* .selectFrom(table(t).as('t'))
* .selectAll()
* .where(ref(c), '=', v)
* .orderBy('t.id')
* .executeTakeFirstOrThrow()
* }
*
* const person = await getRowByColumn('person', 'first_name', 'Arnold')
* ```
*/
table(table) {
return new dynamic_table_builder_js_1.DynamicTableBuilder(table);
}
}
exports.DynamicModule = DynamicModule;
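The `ref` documentation above warns that dynamic column names are not escaped and must always be validated before use. A common way to honor that warning is an allow-list check that rejects anything outside a fixed set of column names before it ever reaches `ref`. A minimal sketch (`SORTABLE_COLUMNS` and `assertSortable` are illustrative names, not part of kysely):

```javascript
// Allow-list sketch: validate a user-supplied column name against a fixed
// set before it is ever used as a dynamic reference. Anything not in the
// set is rejected outright, which blocks SQL injection via column names.
const SORTABLE_COLUMNS = new Set(['first_name', 'last_name', 'birthdate']);

function assertSortable(column) {
  if (!SORTABLE_COLUMNS.has(column)) {
    throw new Error(`unsupported sort column: ${column}`);
  }
  return column;
}
```

With a guard like this in place, a call such as `db.selectFrom('person').orderBy(ref(assertSortable(userInput)))` can only ever see one of the three listed column names.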


@@ -0,0 +1,920 @@
import { type SelectQueryBuilder } from '../query-builder/select-query-builder.js';
import { type TableExpressionOrList } from '../parser/table-parser.js';
import { type FunctionModule } from '../query-builder/function-module.js';
import { type ExtractTypeFromReferenceExpression, type ReferenceExpression, type SimpleReferenceExpression, type StringReference } from '../parser/reference-parser.js';
import type { QueryExecutor } from '../query-executor/query-executor.js';
import { type BinaryOperatorExpression, type FilterObject, type OperandValueExpression, type OperandValueExpressionOrList } from '../parser/binary-operation-parser.js';
import type { Expression } from './expression.js';
import { ExpressionWrapper } from './expression-wrapper.js';
import { type ComparisonOperator, type JSONOperatorWith$, type UnaryOperator } from '../operation-node/operator-node.js';
import type { IsNever, SqlBool } from '../util/type-utils.js';
import { type ExtractTypeFromValueExpression } from '../parser/value-parser.js';
import { CaseBuilder } from '../query-builder/case-builder.js';
import { JSONPathBuilder } from '../query-builder/json-path-builder.js';
import type { OperandExpression } from '../parser/expression-parser.js';
import type { RefTuple2, RefTuple3, RefTuple4, RefTuple5, ValTuple2, ValTuple3, ValTuple4, ValTuple5 } from '../parser/tuple-parser.js';
import type { Selectable } from '../util/column-type.js';
import type { KyselyTypeError } from '../util/type-error.js';
import { type DataTypeExpression } from '../parser/data-type-parser.js';
import type { SelectFrom } from '../parser/select-from-parser.js';
export interface ExpressionBuilder<DB, TB extends keyof DB> {
/**
* Creates a binary expression.
*
* This function returns an {@link Expression} and can be used pretty much anywhere.
* See the examples for a couple of possible use cases.
*
* ### Examples
*
* A simple comparison:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where((eb) => eb('first_name', '=', 'Jennifer'))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select *
* from "person"
* where "first_name" = $1
* ```
*
* By default the third argument is interpreted as a value. To pass in
* a column reference, you can use {@link ref}:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where((eb) => eb('first_name', '=', eb.ref('last_name')))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select *
* from "person"
* where "first_name" = "last_name"
* ```
*
* In the following example `eb` is used to increment an integer column:
*
* ```ts
* await db.updateTable('person')
* .set((eb) => ({
* age: eb('age', '+', 1)
* }))
* .where('id', '=', 3)
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* update "person"
* set "age" = "age" + $1
* where "id" = $2
* ```
*
* As always, expressions can be nested. Both the first and the third argument
* can be any expression:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where((eb) => eb(
* eb.fn<string>('lower', ['first_name']),
* 'in',
* eb.selectFrom('pet')
* .select('pet.name')
* .where('pet.species', '=', 'cat')
* ))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select *
* from "person"
* where lower("first_name") in (
* select "pet"."name"
* from "pet"
* where "pet"."species" = $1
* )
* ```
*/
<RE extends ReferenceExpression<DB, TB>, OP extends BinaryOperatorExpression, VE extends OperandValueExpressionOrList<DB, TB, RE>>(lhs: RE, op: OP, rhs: VE): ExpressionWrapper<DB, TB, OP extends ComparisonOperator ? SqlBool : OP extends Expression<infer T> ? unknown extends T ? SqlBool : T : ExtractTypeFromReferenceExpression<DB, TB, RE>>;
/**
* Returns a copy of `this` expression builder, for destructuring purposes.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .where(({ eb, exists, selectFrom }) =>
* eb('first_name', '=', 'Jennifer').and(exists(
* selectFrom('pet').whereRef('owner_id', '=', 'person.id').select('pet.id')
* ))
* )
* .selectAll()
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select * from "person" where "first_name" = $1 and exists (
* select "pet.id" from "pet" where "owner_id" = "person.id"
* )
* ```
*/
get eb(): ExpressionBuilder<DB, TB>;
/**
* Returns a {@link FunctionModule} that can be used to write type safe function
* calls.
*
* The difference between this and {@link Kysely.fn} is that this one is more
* type safe. You can only refer to columns visible to the part of the query
* you are building. {@link Kysely.fn} allows you to refer to columns in any
* table of the database even if it doesn't produce valid SQL.
*
* ```ts
* const result = await db.selectFrom('person')
* .innerJoin('pet', 'pet.owner_id', 'person.id')
* .select((eb) => [
* 'person.id',
* eb.fn.count('pet.id').as('pet_count')
* ])
* .groupBy('person.id')
* .having((eb) => eb.fn.count('pet.id'), '>', 10)
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person"."id", count("pet"."id") as "pet_count"
* from "person"
* inner join "pet" on "pet"."owner_id" = "person"."id"
* group by "person"."id"
* having count("pet"."id") > $1
* ```
*/
get fn(): FunctionModule<DB, TB>;
/**
* Creates a subquery.
*
* The query builder returned by this method is typed in a way that you can refer to
* all tables of the parent query in addition to the subquery's tables.
*
* This method accepts all the same inputs as {@link QueryCreator.selectFrom}.
*
* ### Examples
*
* This example shows that you can refer to both `pet.owner_id` and `person.id`
* columns from the subquery. This is needed to be able to create correlated
* subqueries:
*
* ```ts
* const result = await db.selectFrom('pet')
* .select((eb) => [
* 'pet.name',
* eb.selectFrom('person')
* .whereRef('person.id', '=', 'pet.owner_id')
* .select('person.first_name')
* .as('owner_name')
* ])
* .execute()
*
* console.log(result[0]?.owner_name)
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* "pet"."name",
* ( select "person"."first_name"
* from "person"
* where "person"."id" = "pet"."owner_id"
* ) as "owner_name"
* from "pet"
* ```
*
* You can use a normal query in place of `(qb) => qb.selectFrom(...)` but in
* that case Kysely typings wouldn't allow you to reference `pet.owner_id`
* because `pet` is not joined to that query.
*/
selectFrom<TE extends TableExpressionOrList<DB, TB>>(from: TE): SelectFrom<DB, TB, TE>;
/**
* Creates a `case` statement/operator.
*
* ### Examples
*
* Kitchen sink example with 2 flavors of `case` operator:
*
* ```ts
* const { title, name } = await db
* .selectFrom('person')
* .where('id', '=', 123)
* .select((eb) => [
* eb.fn.coalesce('last_name', 'first_name').as('name'),
* eb
* .case()
* .when('gender', '=', 'male')
* .then('Mr.')
* .when('gender', '=', 'female')
* .then(
* eb
* .case('marital_status')
* .when('single')
* .then('Ms.')
* .else('Mrs.')
* .end()
* )
* .end()
* .as('title'),
* ])
* .executeTakeFirstOrThrow()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* coalesce("last_name", "first_name") as "name",
* case
* when "gender" = $1 then $2
* when "gender" = $3 then
* case "marital_status"
* when $4 then $5
* else $6
* end
* end as "title"
* from "person"
* where "id" = $7
* ```
*/
case(): CaseBuilder<DB, TB>;
case<C extends SimpleReferenceExpression<DB, TB>>(column: C): CaseBuilder<DB, TB, ExtractTypeFromReferenceExpression<DB, TB, C>>;
case<E extends Expression<any>>(expression: E): CaseBuilder<DB, TB, ExtractTypeFromValueExpression<E>>;
/**
* This method can be used to reference columns within the query's context. For
* a non-type-safe version of this method see {@link sql}'s version.
*
* Additionally, this method can be used to reference nested JSON properties or
* array elements. See {@link JSONPathBuilder} for more information. For regular
* JSON path expressions you can use {@link jsonPath}.
*
* ### Examples
*
* By default the third argument of binary expressions is a value.
* This function can be used to pass in a column reference instead:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where((eb) => eb.or([
* eb('first_name', '=', eb.ref('last_name')),
* eb('first_name', '=', eb.ref('middle_name'))
* ]))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person".*
* from "person"
* where "first_name" = "last_name" or "first_name" = "middle_name"
* ```
*
* In the next example we use the `ref` method to reference columns of the virtual
* table `excluded` in a type-safe way to create an upsert operation:
*
* ```ts
* await db.insertInto('person')
* .values({
* id: 3,
* first_name: 'Jennifer',
* last_name: 'Aniston',
* gender: 'female',
* })
* .onConflict((oc) => oc
* .column('id')
* .doUpdateSet(({ ref }) => ({
* first_name: ref('excluded.first_name'),
* last_name: ref('excluded.last_name'),
* gender: ref('excluded.gender'),
* }))
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* insert into "person" ("id", "first_name", "last_name", "gender")
* values ($1, $2, $3, $4)
* on conflict ("id") do update set
* "first_name" = "excluded"."first_name",
* "last_name" = "excluded"."last_name",
* "gender" = "excluded"."gender"
* ```
*
* In the next example we use `ref` in a raw sql expression. Unless you want
* to be as type-safe as possible, this is probably overkill:
*
* ```ts
* import { sql } from 'kysely'
*
* await db.updateTable('pet')
* .set((eb) => ({
* name: sql<string>`concat(${eb.ref('pet.name')}, ${' the animal'})`
* }))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* update "pet" set "name" = concat("pet"."name", $1)
* ```
*
* In the next example we use `ref` to reference a nested JSON property:
*
* ```ts
* const result = await db.selectFrom('person')
* .where(({ eb, ref }) => eb(
* ref('profile', '->').key('addresses').at(0).key('city'),
* '=',
* 'San Diego'
* ))
* .selectAll()
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select * from "person" where "profile"->'addresses'->0->'city' = $1
* ```
*
     * You can also compile to a JSON path expression by using the `->$` or `->>$` operator:
*
* ```ts
* const result = await db.selectFrom('person')
* .select(({ ref }) =>
* ref('profile', '->$')
* .key('addresses')
* .at('last')
* .key('city')
* .as('current_city')
* )
* .execute()
* ```
*
* The generated SQL (MySQL):
*
* ```sql
* select `profile`->'$.addresses[last].city' as `current_city` from `person`
* ```
*/
ref<RE extends StringReference<DB, TB>>(reference: RE): ExpressionWrapper<DB, TB, ExtractTypeFromReferenceExpression<DB, TB, RE>>;
ref<RE extends StringReference<DB, TB>>(reference: RE, op: JSONOperatorWith$): JSONPathBuilder<ExtractTypeFromReferenceExpression<DB, TB, RE>>;
/**
* Creates a JSON path expression with the provided column as the root document (the `$`).
*
* For a JSON reference expression, see {@link ref}.
*
* ### Examples
*
* ```ts
* await db.updateTable('person')
* .set('profile', (eb) => eb.fn('json_set', [
* 'profile',
* eb.jsonPath<'profile'>().key('addresses').at('last').key('city'),
* eb.val('San Diego')
* ]))
* .where('id', '=', 3)
* .execute()
* ```
*
* The generated SQL (MySQL):
*
* ```sql
* update `person`
* set `profile` = json_set(`profile`, '$.addresses[last].city', ?)
* where `id` = ?
* ```
*/
jsonPath<$ extends StringReference<DB, TB> = never>(): IsNever<$> extends true ? KyselyTypeError<"You must provide a column reference as this method's $ generic"> : JSONPathBuilder<ExtractTypeFromReferenceExpression<DB, TB, $>>;
/**
* Creates a table reference.
*
* ### Examples
*
* ```ts
* import { sql } from 'kysely'
* import type { Pet } from 'type-editor' // imaginary module
*
* const result = await db.selectFrom('person')
* .innerJoin('pet', 'pet.owner_id', 'person.id')
* .select(eb => [
* 'person.id',
* sql<Pet[]>`jsonb_agg(${eb.table('pet')})`.as('pets')
* ])
* .groupBy('person.id')
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person"."id", jsonb_agg("pet") as "pets"
* from "person"
* inner join "pet" on "pet"."owner_id" = "person"."id"
* group by "person"."id"
* ```
*
* If you need a column reference, use {@link ref}.
*/
table<T extends TB & string>(table: T): ExpressionWrapper<DB, TB, Selectable<DB[T]>>;
/**
* Returns a value expression.
*
* This can be used to pass in a value where a reference is taken by default.
*
* This function returns an {@link Expression} and can be used pretty much anywhere.
*
* ### Examples
*
* Binary expressions take a reference by default as the first argument. `val` could
* be used to pass in a value instead:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where((eb) => eb(
* eb.val('cat'),
* '=',
* eb.fn.any(
* eb.selectFrom('pet')
* .select('species')
* .whereRef('owner_id', '=', 'person.id')
* )
* ))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select *
* from "person"
* where $1 = any(
* select "species"
* from "pet"
* where "owner_id" = "person"."id"
* )
* ```
*/
val<VE>(value: VE): ExpressionWrapper<DB, TB, ExtractTypeFromValueExpression<VE>>;
/**
* Creates a tuple expression.
*
* This creates a tuple using column references by default. See {@link tuple}
* if you need to create value tuples.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where(({ eb, refTuple, tuple }) => eb(
* refTuple('first_name', 'last_name'),
* 'in',
* [
* tuple('Jennifer', 'Aniston'),
* tuple('Sylvester', 'Stallone')
* ]
* ))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* "person".*
* from
* "person"
* where
* ("first_name", "last_name")
* in
* (
* ($1, $2),
* ($3, $4)
* )
* ```
*
* In the next example a reference tuple is compared to a subquery. Note that
* in this case you need to use the {@link SelectQueryBuilder.$asTuple | $asTuple}
* function:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where(({ eb, refTuple, selectFrom }) => eb(
* refTuple('first_name', 'last_name'),
* 'in',
* selectFrom('pet')
* .select(['name', 'species'])
* .where('species', '!=', 'cat')
* .$asTuple('name', 'species')
* ))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* "person".*
* from
* "person"
* where
* ("first_name", "last_name")
* in
* (
* select "name", "species"
* from "pet"
* where "species" != $1
* )
* ```
*/
refTuple<R1 extends ReferenceExpression<DB, TB>, R2 extends ReferenceExpression<DB, TB>>(value1: R1, value2: R2): ExpressionWrapper<DB, TB, RefTuple2<DB, TB, R1, R2>>;
refTuple<R1 extends ReferenceExpression<DB, TB>, R2 extends ReferenceExpression<DB, TB>, R3 extends ReferenceExpression<DB, TB>>(value1: R1, value2: R2, value3: R3): ExpressionWrapper<DB, TB, RefTuple3<DB, TB, R1, R2, R3>>;
refTuple<R1 extends ReferenceExpression<DB, TB>, R2 extends ReferenceExpression<DB, TB>, R3 extends ReferenceExpression<DB, TB>, R4 extends ReferenceExpression<DB, TB>>(value1: R1, value2: R2, value3: R3, value4: R4): ExpressionWrapper<DB, TB, RefTuple4<DB, TB, R1, R2, R3, R4>>;
refTuple<R1 extends ReferenceExpression<DB, TB>, R2 extends ReferenceExpression<DB, TB>, R3 extends ReferenceExpression<DB, TB>, R4 extends ReferenceExpression<DB, TB>, R5 extends ReferenceExpression<DB, TB>>(value1: R1, value2: R2, value3: R3, value4: R4, value5: R5): ExpressionWrapper<DB, TB, RefTuple5<DB, TB, R1, R2, R3, R4, R5>>;
/**
* Creates a value tuple expression.
*
* This creates a tuple using values by default. See {@link refTuple} if you need to create
* tuples using column references.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where(({ eb, refTuple, tuple }) => eb(
* refTuple('first_name', 'last_name'),
* 'in',
* [
* tuple('Jennifer', 'Aniston'),
* tuple('Sylvester', 'Stallone')
* ]
* ))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* "person".*
* from
* "person"
* where
* ("first_name", "last_name")
* in
* (
* ($1, $2),
* ($3, $4)
* )
* ```
*/
tuple<V1, V2>(value1: V1, value2: V2): ExpressionWrapper<DB, TB, ValTuple2<V1, V2>>;
tuple<V1, V2, V3>(value1: V1, value2: V2, value3: V3): ExpressionWrapper<DB, TB, ValTuple3<V1, V2, V3>>;
tuple<V1, V2, V3, V4>(value1: V1, value2: V2, value3: V3, value4: V4): ExpressionWrapper<DB, TB, ValTuple4<V1, V2, V3, V4>>;
tuple<V1, V2, V3, V4, V5>(value1: V1, value2: V2, value3: V3, value4: V4, value5: V5): ExpressionWrapper<DB, TB, ValTuple5<V1, V2, V3, V4, V5>>;
/**
* Returns a literal value expression.
*
* Just like `val` but creates a literal value that gets merged into the SQL.
* To prevent SQL injection, only `boolean`, `number` and `null` values
* are accepted. If you need `string` or other literals, use `sql.lit` instead.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .select((eb) => eb.lit(1).as('one'))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select 1 as "one" from "person"
* ```
*/
lit<VE extends number | boolean | null>(literal: VE): ExpressionWrapper<DB, TB, VE>;
/**
* Creates a unary expression.
*
* This function returns an {@link Expression} and can be used pretty much anywhere.
* See the examples for a couple of possible use cases.
*
* @see {@link not}, {@link exists} and {@link neg}.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .select((eb) => [
* 'first_name',
* eb.unary('-', 'age').as('negative_age')
* ])
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "first_name", -"age" as "negative_age"
* from "person"
* ```
*/
unary<RE extends ReferenceExpression<DB, TB>>(op: UnaryOperator, expr: RE): ExpressionWrapper<DB, TB, ExtractTypeFromReferenceExpression<DB, TB, RE>>;
/**
* Creates a `not` operation.
*
* A shortcut for `unary('not', expr)`.
*
* @see {@link unary}
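*
* ### Examples
*
* An illustrative sketch that, like the other examples here, assumes a
* configured `db` instance:
*
* ```ts
* const result = await db.selectFrom('person')
*   .selectAll()
*   .where((eb) => eb.not(eb.exists(
*     eb.selectFrom('pet')
*       .select('id')
*       .whereRef('pet.owner_id', '=', 'person.id')
*   )))
*   .execute()
* ```
*
* The generated SQL (PostgreSQL) should be roughly:
*
* ```sql
* select * from "person"
* where not exists (
*   select "id" from "pet" where "pet"."owner_id" = "person"."id"
* )
* ```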
*/
not<RE extends ReferenceExpression<DB, TB>>(expr: RE): ExpressionWrapper<DB, TB, ExtractTypeFromReferenceExpression<DB, TB, RE>>;
/**
* Creates an `exists` operation.
*
* A shortcut for `unary('exists', expr)`.
*
* @see {@link unary}
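*
* ### Examples
*
* An illustrative sketch that, like the other examples here, assumes a
* configured `db` instance:
*
* ```ts
* const result = await db.selectFrom('person')
*   .selectAll()
*   .where((eb) => eb.exists(
*     eb.selectFrom('pet')
*       .select('id')
*       .whereRef('pet.owner_id', '=', 'person.id')
*   ))
*   .execute()
* ```
*
* The generated SQL (PostgreSQL) should be roughly:
*
* ```sql
* select * from "person"
* where exists (
*   select "id" from "pet" where "pet"."owner_id" = "person"."id"
* )
* ```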
*/
exists<RE extends ReferenceExpression<DB, TB>>(expr: RE): ExpressionWrapper<DB, TB, SqlBool>;
/**
* Creates a negation operation.
*
* A shortcut for `unary('-', expr)`.
*
* @see {@link unary}
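*
* ### Examples
*
* An illustrative sketch that, like the other examples here, assumes a
* configured `db` instance:
*
* ```ts
* const result = await db.selectFrom('person')
*   .select((eb) => eb.neg('age').as('negative_age'))
*   .execute()
* ```
*
* The generated SQL (PostgreSQL) should be roughly:
*
* ```sql
* select -"age" as "negative_age" from "person"
* ```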
*/
neg<RE extends ReferenceExpression<DB, TB>>(expr: RE): ExpressionWrapper<DB, TB, ExtractTypeFromReferenceExpression<DB, TB, RE>>;
/**
* Creates a `between` expression.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where((eb) => eb.between('age', 40, 60))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select * from "person" where "age" between $1 and $2
* ```
*/
between<RE extends ReferenceExpression<DB, TB>, SE extends OperandValueExpression<DB, TB, RE>, EE extends OperandValueExpression<DB, TB, RE>>(expr: RE, start: SE, end: EE): ExpressionWrapper<DB, TB, SqlBool>;
/**
* Creates a `between symmetric` expression.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where((eb) => eb.betweenSymmetric('age', 40, 60))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select * from "person" where "age" between symmetric $1 and $2
* ```
*/
betweenSymmetric<RE extends ReferenceExpression<DB, TB>, SE extends OperandValueExpression<DB, TB, RE>, EE extends OperandValueExpression<DB, TB, RE>>(expr: RE, start: SE, end: EE): ExpressionWrapper<DB, TB, SqlBool>;
/**
* Combines two or more expressions using the logical `and` operator.
*
* An empty array produces a `true` expression.
*
* This function returns an {@link Expression} and can be used pretty much anywhere.
* See the examples for a couple of possible use cases.
*
* ### Examples
*
* In this example we use `and` to create a `WHERE expr1 AND expr2 AND expr3`
* statement:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where((eb) => eb.and([
* eb('first_name', '=', 'Jennifer'),
* eb('first_name', '=', 'Arnold'),
* eb('first_name', '=', 'Sylvester')
* ]))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person".*
* from "person"
* where (
* "first_name" = $1
* and "first_name" = $2
* and "first_name" = $3
* )
* ```
*
* Optionally you can use the simpler object notation if you only need
* equality comparisons:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where((eb) => eb.and({
* first_name: 'Jennifer',
* last_name: 'Aniston'
* }))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person".*
* from "person"
* where (
* "first_name" = $1
* and "last_name" = $2
* )
* ```
*/
and<E extends OperandExpression<SqlBool>>(exprs: ReadonlyArray<E>): ExpressionWrapper<DB, TB, SqlBool>;
and<E extends Readonly<FilterObject<DB, TB>>>(exprs: E): ExpressionWrapper<DB, TB, SqlBool>;
/**
* Combines two or more expressions using the logical `or` operator.
*
* An empty array produces a `false` expression.
*
* This function returns an {@link Expression} and can be used pretty much anywhere.
* See the examples for a couple of possible use cases.
*
* ### Examples
*
* In this example we use `or` to create a `WHERE expr1 OR expr2 OR expr3`
* statement:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where((eb) => eb.or([
* eb('first_name', '=', 'Jennifer'),
* eb('first_name', '=', 'Arnold'),
* eb('first_name', '=', 'Sylvester')
* ]))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person".*
* from "person"
* where (
* "first_name" = $1
* or "first_name" = $2
* or "first_name" = $3
* )
* ```
*
* Optionally you can use the simpler object notation if you only need
* equality comparisons:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where((eb) => eb.or({
* first_name: 'Jennifer',
* last_name: 'Aniston'
* }))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person".*
* from "person"
* where (
* "first_name" = $1
* or "last_name" = $2
* )
* ```
*/
or<E extends OperandExpression<SqlBool>>(exprs: ReadonlyArray<E>): ExpressionWrapper<DB, TB, SqlBool>;
or<E extends Readonly<FilterObject<DB, TB>>>(exprs: E): ExpressionWrapper<DB, TB, SqlBool>;
/**
* Wraps the expression in parentheses.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where((eb) => eb(eb(eb.parens('age', '+', 1), '/', 100), '<', 0.1))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person".*
* from "person"
* where ("age" + $1) / $2 < $3
* ```
*
* You can also pass in any expression as the only argument:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll('person')
* .where((eb) => eb.parens(
* eb('age', '=', 1).or('age', '=', 2)
* ).and(
* eb('first_name', '=', 'Jennifer').or('first_name', '=', 'Arnold')
* ))
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person".*
* from "person"
* where ("age" = $1 or "age" = $2) and ("first_name" = $3 or "first_name" = $4)
* ```
*/
parens<RE extends ReferenceExpression<DB, TB>, OP extends BinaryOperatorExpression, VE extends OperandValueExpressionOrList<DB, TB, RE>>(lhs: RE, op: OP, rhs: VE): ExpressionWrapper<DB, TB, OP extends ComparisonOperator ? SqlBool : ExtractTypeFromReferenceExpression<DB, TB, RE>>;
parens<T>(expr: Expression<T>): ExpressionWrapper<DB, TB, T>;
/**
* Creates a `cast(expr as dataType)` expression.
*
* Since Kysely can't know the mapping between JavaScript and database types,
* you need to provide both explicitly.
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .select((eb) => [
* 'id',
* 'first_name',
* eb.cast<number>('age', 'integer').as('age')
* ])
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "id", "first_name", cast("age" as integer) as "age"
* from "person"
* ```
*/
cast<T, RE extends ReferenceExpression<DB, TB> = ReferenceExpression<DB, TB>>(expr: RE, dataType: DataTypeExpression): ExpressionWrapper<DB, TB, T>;
/**
* See {@link QueryCreator.withSchema}
*
* @deprecated Will be removed in kysely 0.25.0.
*/
withSchema(schema: string): ExpressionBuilder<DB, TB>;
}
export declare function createExpressionBuilder<DB, TB extends keyof DB>(executor?: QueryExecutor): ExpressionBuilder<DB, TB>;
export declare function expressionBuilder<DB, TB extends keyof DB>(_: SelectQueryBuilder<DB, TB, any>): ExpressionBuilder<DB, TB>;
export declare function expressionBuilder<DB, TB extends keyof DB = never>(): ExpressionBuilder<DB, TB>;


@@ -0,0 +1,128 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.createExpressionBuilder = createExpressionBuilder;
exports.expressionBuilder = expressionBuilder;
const select_query_builder_js_1 = require("../query-builder/select-query-builder.js");
const select_query_node_js_1 = require("../operation-node/select-query-node.js");
const table_parser_js_1 = require("../parser/table-parser.js");
const with_schema_plugin_js_1 = require("../plugin/with-schema/with-schema-plugin.js");
const query_id_js_1 = require("../util/query-id.js");
const function_module_js_1 = require("../query-builder/function-module.js");
const reference_parser_js_1 = require("../parser/reference-parser.js");
const binary_operation_parser_js_1 = require("../parser/binary-operation-parser.js");
const parens_node_js_1 = require("../operation-node/parens-node.js");
const expression_wrapper_js_1 = require("./expression-wrapper.js");
const operator_node_js_1 = require("../operation-node/operator-node.js");
const unary_operation_parser_js_1 = require("../parser/unary-operation-parser.js");
const value_parser_js_1 = require("../parser/value-parser.js");
const noop_query_executor_js_1 = require("../query-executor/noop-query-executor.js");
const case_builder_js_1 = require("../query-builder/case-builder.js");
const case_node_js_1 = require("../operation-node/case-node.js");
const object_utils_js_1 = require("../util/object-utils.js");
const json_path_builder_js_1 = require("../query-builder/json-path-builder.js");
const binary_operation_node_js_1 = require("../operation-node/binary-operation-node.js");
const and_node_js_1 = require("../operation-node/and-node.js");
const tuple_node_js_1 = require("../operation-node/tuple-node.js");
const json_path_node_js_1 = require("../operation-node/json-path-node.js");
const data_type_parser_js_1 = require("../parser/data-type-parser.js");
const cast_node_js_1 = require("../operation-node/cast-node.js");
function createExpressionBuilder(executor = noop_query_executor_js_1.NOOP_QUERY_EXECUTOR) {
function binary(lhs, op, rhs) {
return new expression_wrapper_js_1.ExpressionWrapper((0, binary_operation_parser_js_1.parseValueBinaryOperation)(lhs, op, rhs));
}
function unary(op, expr) {
return new expression_wrapper_js_1.ExpressionWrapper((0, unary_operation_parser_js_1.parseUnaryOperation)(op, expr));
}
const eb = Object.assign(binary, {
fn: undefined,
eb: undefined,
selectFrom(table) {
return (0, select_query_builder_js_1.createSelectQueryBuilder)({
queryId: (0, query_id_js_1.createQueryId)(),
executor,
queryNode: select_query_node_js_1.SelectQueryNode.createFrom((0, table_parser_js_1.parseTableExpressionOrList)(table)),
});
},
case(reference) {
return new case_builder_js_1.CaseBuilder({
node: case_node_js_1.CaseNode.create((0, object_utils_js_1.isUndefined)(reference)
? undefined
: (0, reference_parser_js_1.parseReferenceExpression)(reference)),
});
},
ref(reference, op) {
if ((0, object_utils_js_1.isUndefined)(op)) {
return new expression_wrapper_js_1.ExpressionWrapper((0, reference_parser_js_1.parseStringReference)(reference));
}
return new json_path_builder_js_1.JSONPathBuilder((0, reference_parser_js_1.parseJSONReference)(reference, op));
},
jsonPath() {
return new json_path_builder_js_1.JSONPathBuilder(json_path_node_js_1.JSONPathNode.create());
},
table(table) {
return new expression_wrapper_js_1.ExpressionWrapper((0, table_parser_js_1.parseTable)(table));
},
val(value) {
return new expression_wrapper_js_1.ExpressionWrapper((0, value_parser_js_1.parseValueExpression)(value));
},
refTuple(...values) {
return new expression_wrapper_js_1.ExpressionWrapper(tuple_node_js_1.TupleNode.create(values.map(reference_parser_js_1.parseReferenceExpression)));
},
tuple(...values) {
return new expression_wrapper_js_1.ExpressionWrapper(tuple_node_js_1.TupleNode.create(values.map(value_parser_js_1.parseValueExpression)));
},
lit(value) {
return new expression_wrapper_js_1.ExpressionWrapper((0, value_parser_js_1.parseSafeImmediateValue)(value));
},
unary,
not(expr) {
return unary('not', expr);
},
exists(expr) {
return unary('exists', expr);
},
neg(expr) {
return unary('-', expr);
},
between(expr, start, end) {
return new expression_wrapper_js_1.ExpressionWrapper(binary_operation_node_js_1.BinaryOperationNode.create((0, reference_parser_js_1.parseReferenceExpression)(expr), operator_node_js_1.OperatorNode.create('between'), and_node_js_1.AndNode.create((0, value_parser_js_1.parseValueExpression)(start), (0, value_parser_js_1.parseValueExpression)(end))));
},
betweenSymmetric(expr, start, end) {
return new expression_wrapper_js_1.ExpressionWrapper(binary_operation_node_js_1.BinaryOperationNode.create((0, reference_parser_js_1.parseReferenceExpression)(expr), operator_node_js_1.OperatorNode.create('between symmetric'), and_node_js_1.AndNode.create((0, value_parser_js_1.parseValueExpression)(start), (0, value_parser_js_1.parseValueExpression)(end))));
},
and(exprs) {
if ((0, object_utils_js_1.isReadonlyArray)(exprs)) {
return new expression_wrapper_js_1.ExpressionWrapper((0, binary_operation_parser_js_1.parseFilterList)(exprs, 'and'));
}
return new expression_wrapper_js_1.ExpressionWrapper((0, binary_operation_parser_js_1.parseFilterObject)(exprs, 'and'));
},
or(exprs) {
if ((0, object_utils_js_1.isReadonlyArray)(exprs)) {
return new expression_wrapper_js_1.ExpressionWrapper((0, binary_operation_parser_js_1.parseFilterList)(exprs, 'or'));
}
return new expression_wrapper_js_1.ExpressionWrapper((0, binary_operation_parser_js_1.parseFilterObject)(exprs, 'or'));
},
parens(...args) {
const node = (0, binary_operation_parser_js_1.parseValueBinaryOperationOrExpression)(args);
if (parens_node_js_1.ParensNode.is(node)) {
// No double wrapping.
return new expression_wrapper_js_1.ExpressionWrapper(node);
}
else {
return new expression_wrapper_js_1.ExpressionWrapper(parens_node_js_1.ParensNode.create(node));
}
},
cast(expr, dataType) {
return new expression_wrapper_js_1.ExpressionWrapper(cast_node_js_1.CastNode.create((0, reference_parser_js_1.parseReferenceExpression)(expr), (0, data_type_parser_js_1.parseDataTypeExpression)(dataType)));
},
withSchema(schema) {
return createExpressionBuilder(executor.withPluginAtFront(new with_schema_plugin_js_1.WithSchemaPlugin(schema)));
},
});
eb.fn = (0, function_module_js_1.createFunctionModule)();
eb.eb = eb;
return eb;
}
function expressionBuilder(_) {
return createExpressionBuilder();
}


@@ -0,0 +1,631 @@
import { AliasNode } from '../operation-node/alias-node.js';
import { AndNode } from '../operation-node/and-node.js';
import type { OperationNode } from '../operation-node/operation-node.js';
import { OrNode } from '../operation-node/or-node.js';
import { ParensNode } from '../operation-node/parens-node.js';
import { type ComparisonOperatorExpression, type OperandValueExpressionOrList } from '../parser/binary-operation-parser.js';
import type { OperandExpression } from '../parser/expression-parser.js';
import type { ReferenceExpression } from '../parser/reference-parser.js';
import type { KyselyTypeError } from '../util/type-error.js';
import type { SqlBool } from '../util/type-utils.js';
import type { AliasableExpression, AliasedExpression, Expression } from './expression.js';
export declare class ExpressionWrapper<DB, TB extends keyof DB, T> implements AliasableExpression<T> {
#private;
constructor(node: OperationNode);
/** @private */
/**
* All expressions need to have this getter for complicated type-related reasons.
* Simply add this getter for your expression and always return `undefined` from it:
*
* ### Examples
*
* ```ts
* import { type Expression, type OperationNode, sql } from 'kysely'
*
* class SomeExpression<T> implements Expression<T> {
* get expressionType(): T | undefined {
* return undefined
* }
*
* toOperationNode(): OperationNode {
* return sql`some sql here`.toOperationNode()
* }
* }
* ```
*
* The getter is needed to make the expression assignable to another expression only
* if the types `T` are assignable. Without this property (or some other property
* that references `T`), you could assign `Expression<string>` to `Expression<number>`.
*/
get expressionType(): T | undefined;
/**
* Returns an aliased version of the expression.
*
* ### Examples
*
* In addition to slapping `as "the_alias"` at the end of the SQL,
* this method also provides strict typing:
*
* ```ts
* const result = await db
* .selectFrom('person')
* .select((eb) =>
* eb('first_name', '=', 'Jennifer').as('is_jennifer')
* )
* .executeTakeFirstOrThrow()
*
* // `is_jennifer: SqlBool` field exists in the result type.
* console.log(result.is_jennifer)
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "first_name" = $1 as "is_jennifer"
* from "person"
* ```
*/
as<A extends string>(alias: A): AliasedExpression<T, A>;
/**
* Returns an aliased version of the expression.
*
* ### Examples
*
* In addition to slapping `as "the_alias"` at the end of the expression,
* this method also provides strict typing:
*
* ```ts
* const result = await db
* .selectFrom('person')
* .select((eb) =>
* // `eb.fn<string>` returns an AliasableExpression<string>
* eb.fn<string>('concat', ['first_name', eb.val(' '), 'last_name']).as('full_name')
* )
* .executeTakeFirstOrThrow()
*
* // `full_name: string` field exists in the result type.
* console.log(result.full_name)
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* concat("first_name", $1, "last_name") as "full_name"
* from
* "person"
* ```
*
* You can also pass in a raw SQL snippet (or any expression) but in that case you must
* provide the alias as the only type argument:
*
* ```ts
* import { sql } from 'kysely'
*
* const values = sql<{ a: number, b: string }>`(values (1, 'foo'))`
*
* // The alias is `t(a, b)` which specifies the column names
* // in addition to the table name. We must tell kysely that
* // columns of the table can be referenced through `t`
* // by providing an explicit type argument.
* const aliasedValues = values.as<'t'>(sql`t(a, b)`)
*
* await db
* .insertInto('person')
* .columns(['first_name', 'last_name'])
* .expression(
* db.selectFrom(aliasedValues).select(['t.a', 't.b'])
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* insert into "person" ("first_name", "last_name")
* select "t"."a", "t"."b"
* from (values (1, 'foo')) as t(a, b)
* ```
*/
as<A extends string>(alias: Expression<unknown>): AliasedExpression<T, A>;
/**
* Combines `this` and another expression using `OR`.
*
* Also see {@link ExpressionBuilder.or}
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where(eb => eb('first_name', '=', 'Jennifer')
* .or('first_name', '=', 'Arnold')
* .or('first_name', '=', 'Sylvester')
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select *
* from "person"
* where (
* "first_name" = $1
* or "first_name" = $2
* or "first_name" = $3
* )
* ```
*
* You can also pass any expression as the only argument to
* this method:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where(eb => eb('first_name', '=', 'Jennifer')
* .or(eb('first_name', '=', 'Sylvester').and('last_name', '=', 'Stallone'))
* .or(eb.exists(
* eb.selectFrom('pet')
* .select('id')
* .whereRef('pet.owner_id', '=', 'person.id')
* ))
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select *
* from "person"
* where (
* "first_name" = $1
* or ("first_name" = $2 and "last_name" = $3)
* or exists (
* select "id"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* )
* )
* ```
*/
or<RE extends ReferenceExpression<DB, TB>, VE extends OperandValueExpressionOrList<DB, TB, RE>>(lhs: RE, op: ComparisonOperatorExpression, rhs: VE): T extends SqlBool ? OrWrapper<DB, TB, SqlBool> : KyselyTypeError<'or() method can only be called on boolean expressions'>;
or<E extends OperandExpression<SqlBool>>(expression: E): T extends SqlBool ? OrWrapper<DB, TB, SqlBool> : KyselyTypeError<'or() method can only be called on boolean expressions'>;
/**
* Combines `this` and another expression using `AND`.
*
* Also see {@link ExpressionBuilder.and}
*
* ### Examples
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where(eb => eb('first_name', '=', 'Jennifer')
* .and('last_name', '=', 'Aniston')
* .and('age', '>', 40)
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select *
* from "person"
* where (
* "first_name" = $1
* and "last_name" = $2
* and "age" > $3
* )
* ```
*
* You can also pass any expression as the only argument to
* this method:
*
* ```ts
* const result = await db.selectFrom('person')
* .selectAll()
* .where(eb => eb('first_name', '=', 'Jennifer')
* .and(eb('first_name', '=', 'Sylvester').or('last_name', '=', 'Stallone'))
* .and(eb.exists(
* eb.selectFrom('pet')
* .select('id')
* .whereRef('pet.owner_id', '=', 'person.id')
* ))
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select *
* from "person"
* where (
* "first_name" = $1
* and ("first_name" = $2 or "last_name" = $3)
* and exists (
* select "id"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* )
* )
* ```
*/
and<RE extends ReferenceExpression<DB, TB>, VE extends OperandValueExpressionOrList<DB, TB, RE>>(lhs: RE, op: ComparisonOperatorExpression, rhs: VE): T extends SqlBool ? AndWrapper<DB, TB, SqlBool> : KyselyTypeError<'and() method can only be called on boolean expressions'>;
and<E extends OperandExpression<SqlBool>>(expression: E): T extends SqlBool ? AndWrapper<DB, TB, SqlBool> : KyselyTypeError<'and() method can only be called on boolean expressions'>;
/**
* Change the output type of the expression.
*
* This method call doesn't change the SQL in any way. It simply
* returns a copy of this `ExpressionWrapper` with a new output type.
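*
* ### Examples
*
* An illustrative sketch that, like the other examples here, assumes a
* configured `db` instance:
*
* ```ts
* const result = await db.selectFrom('person')
*   .select((eb) =>
*     eb('age', '>', 18).$castTo<boolean>().as('is_adult')
*   )
*   .executeTakeFirstOrThrow()
*
* // `is_adult` is now typed `boolean` instead of `SqlBool`.
* console.log(result.is_adult)
* ```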
*/
$castTo<C>(): ExpressionWrapper<DB, TB, C>;
/**
* Omit null from the expression's type.
*
* This function can be useful in cases where you know an expression can't be
* null, but Kysely is unable to infer it.
*
* This method call doesn't change the SQL in any way. It simply
* returns a copy of `this` with a new output type.
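*
* ### Examples
*
* An illustrative sketch that, like the other examples here, assumes a
* configured `db` instance:
*
* ```ts
* const result = await db.selectFrom('person')
*   .select((eb) => eb.fn.max('age').$notNull().as('max_age'))
*   .executeTakeFirstOrThrow()
*
* // `max()` can return `null` when there are no rows; if you know there
* // is at least one row, `$notNull()` narrows the result type.
* console.log(result.max_age)
* ```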
*/
$notNull(): ExpressionWrapper<DB, TB, Exclude<T, null>>;
/**
* Creates the OperationNode that describes how to compile this expression into SQL.
*
* ### Examples
*
* If you are creating a custom expression, it's often easiest to use the {@link sql}
* template tag to build the node:
*
* ```ts
* import { type Expression, type OperationNode, sql } from 'kysely'
*
* class SomeExpression<T> implements Expression<T> {
* get expressionType(): T | undefined {
* return undefined
* }
*
* toOperationNode(): OperationNode {
* return sql`some sql here`.toOperationNode()
* }
* }
* ```
*/
toOperationNode(): OperationNode;
}
export declare class AliasedExpressionWrapper<T, A extends string> implements AliasedExpression<T, A> {
#private;
constructor(expr: Expression<T>, alias: A | Expression<unknown>);
/** @private */
/**
* Returns the aliased expression.
*/
get expression(): Expression<T>;
/** @private */
/**
* Returns the alias.
*/
get alias(): A | Expression<unknown>;
/**
* Creates the OperationNode that describes how to compile this expression into SQL.
*/
toOperationNode(): AliasNode;
}
export declare class OrWrapper<DB, TB extends keyof DB, T extends SqlBool> implements AliasableExpression<T> {
#private;
constructor(node: OrNode);
/** @private */
/**
* All expressions need to have this getter for complicated type-related reasons.
* Simply add this getter for your expression and always return `undefined` from it:
*
* ### Examples
*
* ```ts
* import { type Expression, type OperationNode, sql } from 'kysely'
*
* class SomeExpression<T> implements Expression<T> {
* get expressionType(): T | undefined {
* return undefined
* }
*
* toOperationNode(): OperationNode {
* return sql`some sql here`.toOperationNode()
* }
* }
* ```
*
* The getter is needed to make the expression assignable to another expression only
* if the types `T` are assignable. Without this property (or some other property
* that references `T`), you could assign `Expression<string>` to `Expression<number>`.
*/
get expressionType(): T | undefined;
/**
* Returns an aliased version of the expression.
*
* In addition to slapping `as "the_alias"` at the end of the SQL,
* this method also provides strict typing:
*
* ```ts
* const result = await db
* .selectFrom('person')
* .select(eb =>
* eb('first_name', '=', 'Jennifer')
* .or('first_name', '=', 'Sylvester')
* .as('is_jennifer_or_sylvester')
* )
* .executeTakeFirstOrThrow()
*
* // `is_jennifer_or_sylvester: SqlBool` field exists in the result type.
* console.log(result.is_jennifer_or_sylvester)
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "first_name" = $1 or "first_name" = $2 as "is_jennifer_or_sylvester"
* from "person"
* ```
*/
as<A extends string>(alias: A): AliasedExpression<T, A>;
/**
* Returns an aliased version of the expression.
*
* ### Examples
*
* In addition to slapping `as "the_alias"` at the end of the expression,
* this method also provides strict typing:
*
* ```ts
* const result = await db
* .selectFrom('person')
* .select((eb) =>
* // `eb.fn<string>` returns an AliasableExpression<string>
* eb.fn<string>('concat', ['first_name', eb.val(' '), 'last_name']).as('full_name')
* )
* .executeTakeFirstOrThrow()
*
* // `full_name: string` field exists in the result type.
* console.log(result.full_name)
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* concat("first_name", $1, "last_name") as "full_name"
* from
* "person"
* ```
*
* You can also pass in a raw SQL snippet (or any expression) but in that case you must
* provide the alias as the only type argument:
*
* ```ts
* import { sql } from 'kysely'
*
* const values = sql<{ a: number, b: string }>`(values (1, 'foo'))`
*
* // The alias is `t(a, b)` which specifies the column names
* // in addition to the table name. We must tell kysely that
* // columns of the table can be referenced through `t`
* // by providing an explicit type argument.
* const aliasedValues = values.as<'t'>(sql`t(a, b)`)
*
* await db
* .insertInto('person')
* .columns(['first_name', 'last_name'])
* .expression(
* db.selectFrom(aliasedValues).select(['t.a', 't.b'])
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* insert into "person" ("first_name", "last_name")
* select "t"."a", "t"."b"
* from (values (1, 'foo')) as t(a, b)
* ```
*/
as<A extends string>(alias: Expression<unknown>): AliasedExpression<T, A>;
/**
* Combines `this` and another expression using `OR`.
*
* See {@link ExpressionWrapper.or} for examples.
*/
or<RE extends ReferenceExpression<DB, TB>, VE extends OperandValueExpressionOrList<DB, TB, RE>>(lhs: RE, op: ComparisonOperatorExpression, rhs: VE): OrWrapper<DB, TB, T>;
or<E extends OperandExpression<SqlBool>>(expression: E): OrWrapper<DB, TB, T>;
/**
* Change the output type of the expression.
*
* This method call doesn't change the SQL in any way. This method simply
* returns a copy of this `OrWrapper` with a new output type.
*/
$castTo<C extends SqlBool>(): OrWrapper<DB, TB, C>;
/**
* Creates the OperationNode that describes how to compile this expression into SQL.
*
* ### Examples
*
* If you are creating a custom expression, it's often easiest to use the {@link sql}
* template tag to build the node:
*
* ```ts
* import { type Expression, type OperationNode, sql } from 'kysely'
*
* class SomeExpression<T> implements Expression<T> {
* get expressionType(): T | undefined {
* return undefined
* }
*
* toOperationNode(): OperationNode {
* return sql`some sql here`.toOperationNode()
* }
* }
* ```
*/
toOperationNode(): ParensNode;
}
export declare class AndWrapper<DB, TB extends keyof DB, T extends SqlBool> implements AliasableExpression<T> {
#private;
constructor(node: AndNode);
/** @private */
/**
* All expressions need to have this getter for complicated type-related reasons.
* Simply add this getter for your expression and always return `undefined` from it:
*
* ### Examples
*
* ```ts
* import { type Expression, type OperationNode, sql } from 'kysely'
*
* class SomeExpression<T> implements Expression<T> {
* get expressionType(): T | undefined {
* return undefined
* }
*
* toOperationNode(): OperationNode {
* return sql`some sql here`.toOperationNode()
* }
* }
* ```
*
* The getter is needed to make the expression assignable to another expression only
* if the types `T` are assignable. Without this property (or some other property
* that references `T`), you could assign `Expression<string>` to `Expression<number>`.
*/
get expressionType(): T | undefined;
/**
* Returns an aliased version of the expression.
*
* In addition to slapping `as "the_alias"` at the end of the SQL,
* this method also provides strict typing:
*
* ```ts
* const result = await db
* .selectFrom('person')
* .select(eb =>
* eb('first_name', '=', 'Jennifer')
* .and('last_name', '=', 'Aniston')
* .as('is_jennifer_aniston')
* )
* .executeTakeFirstOrThrow()
*
* // `is_jennifer_aniston: SqlBool` field exists in the result type.
* console.log(result.is_jennifer_aniston)
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "first_name" = $1 and "last_name" = $2 as "is_jennifer_aniston"
* from "person"
* ```
*/
as<A extends string>(alias: A): AliasedExpression<T, A>;
/**
* Returns an aliased version of the expression.
*
* ### Examples
*
* In addition to slapping `as "the_alias"` at the end of the expression,
* this method also provides strict typing:
*
* ```ts
* const result = await db
* .selectFrom('person')
* .select((eb) =>
* // `eb.fn<string>` returns an AliasableExpression<string>
* eb.fn<string>('concat', ['first_name', eb.val(' '), 'last_name']).as('full_name')
* )
* .executeTakeFirstOrThrow()
*
* // `full_name: string` field exists in the result type.
* console.log(result.full_name)
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* concat("first_name", $1, "last_name") as "full_name"
* from
* "person"
* ```
*
* You can also pass in a raw SQL snippet (or any expression) but in that case you must
* provide the alias as the only type argument:
*
* ```ts
* import { sql } from 'kysely'
*
* const values = sql<{ a: number, b: string }>`(values (1, 'foo'))`
*
* // The alias is `t(a, b)` which specifies the column names
* // in addition to the table name. We must tell kysely that
* // columns of the table can be referenced through `t`
* // by providing an explicit type argument.
* const aliasedValues = values.as<'t'>(sql`t(a, b)`)
*
* await db
* .insertInto('person')
* .columns(['first_name', 'last_name'])
* .expression(
* db.selectFrom(aliasedValues).select(['t.a', 't.b'])
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* insert into "person" ("first_name", "last_name")
* select "t"."a", "t"."b"
* from (values (1, 'foo')) as t(a, b)
* ```
*/
as<A extends string>(alias: Expression<unknown>): AliasedExpression<T, A>;
/**
* Combines `this` and another expression using `AND`.
*
* See {@link ExpressionWrapper.and} for examples.
*/
and<RE extends ReferenceExpression<DB, TB>, VE extends OperandValueExpressionOrList<DB, TB, RE>>(lhs: RE, op: ComparisonOperatorExpression, rhs: VE): AndWrapper<DB, TB, T>;
and<E extends OperandExpression<SqlBool>>(expression: E): AndWrapper<DB, TB, T>;
/**
* Change the output type of the expression.
*
* This method call doesn't change the SQL in any way. This method simply
* returns a copy of this `AndWrapper` with a new output type.
*/
$castTo<C extends SqlBool>(): AndWrapper<DB, TB, C>;
/**
* Creates the OperationNode that describes how to compile this expression into SQL.
*
* ### Examples
*
* If you are creating a custom expression, it's often easiest to use the {@link sql}
* template tag to build the node:
*
* ```ts
* import { type Expression, type OperationNode, sql } from 'kysely'
*
* class SomeExpression<T> implements Expression<T> {
* get expressionType(): T | undefined {
* return undefined
* }
*
* toOperationNode(): OperationNode {
* return sql`some sql here`.toOperationNode()
* }
* }
* ```
*/
toOperationNode(): ParensNode;
}
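
The `expressionType` getter documented repeatedly above can be illustrated in isolation. The sketch below is hypothetical (the `Expr` interface and `rawExpr` helper are stand-ins, not kysely's API): a property typed `T | undefined` that is never set at runtime is enough to make TypeScript reject assigning `Expr<string>` to `Expr<number>`, which is exactly the role the getter plays in `Expression<T>`.

```typescript
// Minimal sketch of the phantom-type trick behind `expressionType`.
// `Expr` and `rawExpr` are illustrative names, NOT kysely's API.
interface Expr<T> {
  // Never assigned at runtime; it only ties the interface to `T` so that
  // `Expr<string>` is not assignable to `Expr<number>`.
  readonly expressionType?: T
  toSql(): string
}

function rawExpr<T>(sql: string): Expr<T> {
  return { toSql: () => sql }
}

const greeting: Expr<string> = rawExpr<string>(`'hello'`)
console.log(greeting.toSql())        // 'hello'
console.log(greeting.expressionType) // undefined
```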


@@ -0,0 +1,134 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.AndWrapper = exports.OrWrapper = exports.AliasedExpressionWrapper = exports.ExpressionWrapper = void 0;
const alias_node_js_1 = require("../operation-node/alias-node.js");
const and_node_js_1 = require("../operation-node/and-node.js");
const identifier_node_js_1 = require("../operation-node/identifier-node.js");
const operation_node_source_js_1 = require("../operation-node/operation-node-source.js");
const or_node_js_1 = require("../operation-node/or-node.js");
const parens_node_js_1 = require("../operation-node/parens-node.js");
const binary_operation_parser_js_1 = require("../parser/binary-operation-parser.js");
class ExpressionWrapper {
#node;
constructor(node) {
this.#node = node;
}
/** @private */
get expressionType() {
return undefined;
}
as(alias) {
return new AliasedExpressionWrapper(this, alias);
}
or(...args) {
return new OrWrapper(or_node_js_1.OrNode.create(this.#node, (0, binary_operation_parser_js_1.parseValueBinaryOperationOrExpression)(args)));
}
and(...args) {
return new AndWrapper(and_node_js_1.AndNode.create(this.#node, (0, binary_operation_parser_js_1.parseValueBinaryOperationOrExpression)(args)));
}
/**
* Change the output type of the expression.
*
* This method call doesn't change the SQL in any way. This method simply
* returns a copy of this `ExpressionWrapper` with a new output type.
*/
$castTo() {
return new ExpressionWrapper(this.#node);
}
/**
* Omit null from the expression's type.
*
* This function can be useful in cases where you know an expression can't be
* null, but Kysely is unable to infer it.
*
* This method call doesn't change the SQL in any way. This method simply
* returns a copy of `this` with a new output type.
*/
$notNull() {
return new ExpressionWrapper(this.#node);
}
toOperationNode() {
return this.#node;
}
}
exports.ExpressionWrapper = ExpressionWrapper;
class AliasedExpressionWrapper {
#expr;
#alias;
constructor(expr, alias) {
this.#expr = expr;
this.#alias = alias;
}
/** @private */
get expression() {
return this.#expr;
}
/** @private */
get alias() {
return this.#alias;
}
toOperationNode() {
return alias_node_js_1.AliasNode.create(this.#expr.toOperationNode(), (0, operation_node_source_js_1.isOperationNodeSource)(this.#alias)
? this.#alias.toOperationNode()
: identifier_node_js_1.IdentifierNode.create(this.#alias));
}
}
exports.AliasedExpressionWrapper = AliasedExpressionWrapper;
class OrWrapper {
#node;
constructor(node) {
this.#node = node;
}
/** @private */
get expressionType() {
return undefined;
}
as(alias) {
return new AliasedExpressionWrapper(this, alias);
}
or(...args) {
return new OrWrapper(or_node_js_1.OrNode.create(this.#node, (0, binary_operation_parser_js_1.parseValueBinaryOperationOrExpression)(args)));
}
/**
* Change the output type of the expression.
*
* This method call doesn't change the SQL in any way. This method simply
* returns a copy of this `OrWrapper` with a new output type.
*/
$castTo() {
return new OrWrapper(this.#node);
}
toOperationNode() {
return parens_node_js_1.ParensNode.create(this.#node);
}
}
exports.OrWrapper = OrWrapper;
class AndWrapper {
#node;
constructor(node) {
this.#node = node;
}
/** @private */
get expressionType() {
return undefined;
}
as(alias) {
return new AliasedExpressionWrapper(this, alias);
}
and(...args) {
return new AndWrapper(and_node_js_1.AndNode.create(this.#node, (0, binary_operation_parser_js_1.parseValueBinaryOperationOrExpression)(args)));
}
/**
* Change the output type of the expression.
*
* This method call doesn't change the SQL in any way. This method simply
* returns a copy of this `AndWrapper` with a new output type.
*/
$castTo() {
return new AndWrapper(this.#node);
}
toOperationNode() {
return parens_node_js_1.ParensNode.create(this.#node);
}
}
exports.AndWrapper = AndWrapper;

node_modules/kysely/dist/cjs/expression/expression.d.ts generated vendored Normal file

@@ -0,0 +1,198 @@
import type { AliasNode } from '../operation-node/alias-node.js';
import { type OperationNodeSource } from '../operation-node/operation-node-source.js';
import type { OperationNode } from '../operation-node/operation-node.js';
/**
* `Expression` represents an arbitrary SQL expression with a type.
*
* Most Kysely methods accept instances of `Expression` and most classes like `SelectQueryBuilder`
* and the return value of the {@link sql} template tag implement it.
*
* ### Examples
*
* ```ts
* import { type Expression, sql } from 'kysely'
*
* const exp1: Expression<string> = sql<string>`CONCAT('hello', ' ', 'world')`
* const exp2: Expression<{ first_name: string }> = db.selectFrom('person').select('first_name')
* ```
*
* You can implement the `Expression` interface to create your own type-safe utilities for Kysely.
*/
export interface Expression<T> extends OperationNodeSource {
/**
* All expressions need to have this getter for complicated type-related reasons.
* Simply add this getter for your expression and always return `undefined` from it:
*
* ### Examples
*
* ```ts
* import { type Expression, type OperationNode, sql } from 'kysely'
*
* class SomeExpression<T> implements Expression<T> {
* get expressionType(): T | undefined {
* return undefined
* }
*
* toOperationNode(): OperationNode {
* return sql`some sql here`.toOperationNode()
* }
* }
* ```
*
* The getter is needed to make the expression assignable to another expression only
* if the types `T` are assignable. Without this property (or some other property
* that references `T`), you could assign `Expression<string>` to `Expression<number>`.
*/
get expressionType(): T | undefined;
/**
* Creates the OperationNode that describes how to compile this expression into SQL.
*
* ### Examples
*
* If you are creating a custom expression, it's often easiest to use the {@link sql}
* template tag to build the node:
*
* ```ts
* import { type Expression, type OperationNode, sql } from 'kysely'
*
* class SomeExpression<T> implements Expression<T> {
* get expressionType(): T | undefined {
* return undefined
* }
*
* toOperationNode(): OperationNode {
* return sql`some sql here`.toOperationNode()
* }
* }
* ```
*/
toOperationNode(): OperationNode;
}
/**
* An expression with an `as` method.
*/
export interface AliasableExpression<T> extends Expression<T> {
/**
* Returns an aliased version of the expression.
*
* ### Examples
*
* In addition to slapping `as "the_alias"` at the end of the expression,
* this method also provides strict typing:
*
* ```ts
* const result = await db
* .selectFrom('person')
* .select((eb) =>
* // `eb.fn<string>` returns an AliasableExpression<string>
* eb.fn<string>('concat', ['first_name', eb.val(' '), 'last_name']).as('full_name')
* )
* .executeTakeFirstOrThrow()
*
* // `full_name: string` field exists in the result type.
* console.log(result.full_name)
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select
* concat("first_name", $1, "last_name") as "full_name"
* from
* "person"
* ```
*
* You can also pass in a raw SQL snippet (or any expression) but in that case you must
* provide the alias as the only type argument:
*
* ```ts
* import { sql } from 'kysely'
*
* const values = sql<{ a: number, b: string }>`(values (1, 'foo'))`
*
* // The alias is `t(a, b)` which specifies the column names
* // in addition to the table name. We must tell kysely that
* // columns of the table can be referenced through `t`
* // by providing an explicit type argument.
* const aliasedValues = values.as<'t'>(sql`t(a, b)`)
*
* await db
* .insertInto('person')
* .columns(['first_name', 'last_name'])
* .expression(
* db.selectFrom(aliasedValues).select(['t.a', 't.b'])
* )
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* insert into "person" ("first_name", "last_name")
* select "t"."a", "t"."b"
* from (values (1, 'foo')) as t(a, b)
* ```
*/
as<A extends string>(alias: A): AliasedExpression<T, A>;
as<A extends string>(alias: Expression<any>): AliasedExpression<T, A>;
}
/**
* A type that holds an expression and an alias for it.
*
* `AliasedExpression<T, A>` can be used in places where, in addition to the value type `T`, you
* also need a name `A` for that value. For example anything you can pass into the `select` method
* needs to implement an `AliasedExpression<T, A>`. `A` becomes the name of the selected expression
* in the result and `T` becomes its type.
*
* ### Examples
*
* ```ts
* import {
* AliasNode,
* type AliasedExpression,
* type Expression,
* IdentifierNode
* } from 'kysely'
*
* class SomeAliasedExpression<T, A extends string> implements AliasedExpression<T, A> {
* #expression: Expression<T>
* #alias: A
*
* constructor(expression: Expression<T>, alias: A) {
* this.#expression = expression
* this.#alias = alias
* }
*
* get expression(): Expression<T> {
* return this.#expression
* }
*
* get alias(): A {
* return this.#alias
* }
*
* toOperationNode(): AliasNode {
* return AliasNode.create(
* this.#expression.toOperationNode(),
* IdentifierNode.create(this.#alias)
* )
* }
* }
* ```
*/
export interface AliasedExpression<T, A extends string> extends OperationNodeSource {
/**
* Returns the aliased expression.
*/
get expression(): Expression<T>;
/**
* Returns the alias.
*/
get alias(): A | Expression<unknown>;
/**
* Creates the OperationNode that describes how to compile this expression into SQL.
*/
toOperationNode(): AliasNode;
}
export declare function isExpression(obj: unknown): obj is Expression<any>;
export declare function isAliasedExpression(obj: unknown): obj is AliasedExpression<any, any>;

node_modules/kysely/dist/cjs/expression/expression.js generated vendored Normal file

@@ -0,0 +1,15 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.isExpression = isExpression;
exports.isAliasedExpression = isAliasedExpression;
const operation_node_source_js_1 = require("../operation-node/operation-node-source.js");
const object_utils_js_1 = require("../util/object-utils.js");
function isExpression(obj) {
return (0, object_utils_js_1.isObject)(obj) && 'expressionType' in obj && (0, operation_node_source_js_1.isOperationNodeSource)(obj);
}
function isAliasedExpression(obj) {
return ((0, object_utils_js_1.isObject)(obj) &&
'expression' in obj &&
(0, object_utils_js_1.isString)(obj.alias) &&
(0, operation_node_source_js_1.isOperationNodeSource)(obj));
}

node_modules/kysely/dist/cjs/helpers/mssql.d.ts generated vendored Normal file

@@ -0,0 +1,212 @@
import type { Expression } from '../expression/expression.js';
import type { RawBuilder } from '../raw-builder/raw-builder.js';
import type { ShallowDehydrateObject, ShallowDehydrateValue, Simplify } from '../util/type-utils.js';
/**
* An MS SQL Server helper for aggregating a subquery into a JSON array.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import { Kysely, MssqlDialect, ParseJSONResultsPlugin } from 'kysely'
* import * as Tarn from 'tarn'
* import * as Tedious from 'tedious'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new MssqlDialect({
* tarn: { options: { max: 10, min: 0 }, ...Tarn },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* authentication: {
* options: { password: 'password', userName: 'sa' },
* type: 'default',
* },
* options: { database: 'test', port: 21433, trustServerCertificate: true },
* server: 'localhost',
* }),
* },
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonArrayFrom } from 'kysely/helpers/mssql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonArrayFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .orderBy('pet.name')
* .offset(0)
* ).as('pets')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.pets[0]?.pet_id
* result[0]?.pets[0]?.name
* ```
*
* The generated SQL (MS SQL Server):
*
* ```sql
* select "id", (
* select coalesce((select * from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* order by "pet"."name"
* offset @1 rows
* ) as "agg" for json path, include_null_values), '[]')
* ) as "pets"
* from "person"
* ```
*/
export declare function jsonArrayFrom<O>(expr: Expression<O>): RawBuilder<Simplify<ShallowDehydrateObject<O>>[]>;
/**
* An MS SQL Server helper for turning a subquery into a JSON object.
*
* The subquery must only return one row.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import { Kysely, MssqlDialect, ParseJSONResultsPlugin } from 'kysely'
* import * as Tarn from 'tarn'
* import * as Tedious from 'tedious'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new MssqlDialect({
* tarn: { options: { max: 10, min: 0 }, ...Tarn },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* authentication: {
* options: { password: 'password', userName: 'sa' },
* type: 'default',
* },
* options: { database: 'test', port: 21433, trustServerCertificate: true },
* server: 'localhost',
* }),
* },
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonObjectFrom } from 'kysely/helpers/mssql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonObjectFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .where('pet.is_favorite', '=', 1)
* ).as('favorite_pet')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.favorite_pet?.pet_id
* result[0]?.favorite_pet?.name
* ```
*
* The generated SQL (MS SQL Server):
*
* ```sql
* select "id", (
* select * from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* and "pet"."is_favorite" = @1
* ) as "agg" for json path, include_null_values, without_array_wrapper
* ) as "favorite_pet"
* from "person"
* ```
*/
export declare function jsonObjectFrom<O>(expr: Expression<O>): RawBuilder<Simplify<ShallowDehydrateObject<O>> | null>;
/**
* The MS SQL Server `json_query` function, single argument variant.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import { Kysely, MssqlDialect, ParseJSONResultsPlugin } from 'kysely'
* import * as Tarn from 'tarn'
* import * as Tedious from 'tedious'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new MssqlDialect({
* tarn: { options: { max: 10, min: 0 }, ...Tarn },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* authentication: {
* options: { password: 'password', userName: 'sa' },
* type: 'default',
* },
* options: { database: 'test', port: 21433, trustServerCertificate: true },
* server: 'localhost',
* }),
* },
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonBuildObject } from 'kysely/helpers/mssql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonBuildObject({
* first: eb.ref('first_name'),
* last: eb.ref('last_name'),
* full: eb.fn('concat', ['first_name', eb.val(' '), 'last_name'])
* }).as('name')
* ])
* .execute()
* ```
*
* The generated SQL (MS SQL Server):
*
* ```sql
* select "id", json_query(
* '{"first":"'+"first_name"+',"last":"'+"last_name"+',"full":"'+concat("first_name", ' ', "last_name")+'"}'
* ) as "name"
* from "person"
* ```
*/
export declare function jsonBuildObject<O extends Record<string, Expression<unknown>>>(obj: O): RawBuilder<Simplify<{
[K in keyof O]: O[K] extends Expression<infer V> ? ShallowDehydrateValue<V> : never;
}>>;

node_modules/kysely/dist/cjs/helpers/mssql.js generated vendored Normal file

@@ -0,0 +1,219 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.jsonArrayFrom = jsonArrayFrom;
exports.jsonObjectFrom = jsonObjectFrom;
exports.jsonBuildObject = jsonBuildObject;
const sql_js_1 = require("../raw-builder/sql.js");
/**
* An MS SQL Server helper for aggregating a subquery into a JSON array.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import { Kysely, MssqlDialect, ParseJSONResultsPlugin } from 'kysely'
* import * as Tarn from 'tarn'
* import * as Tedious from 'tedious'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new MssqlDialect({
* tarn: { options: { max: 10, min: 0 }, ...Tarn },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* authentication: {
* options: { password: 'password', userName: 'sa' },
* type: 'default',
* },
* options: { database: 'test', port: 21433, trustServerCertificate: true },
* server: 'localhost',
* }),
* },
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonArrayFrom } from 'kysely/helpers/mssql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonArrayFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .orderBy('pet.name')
* .offset(0)
* ).as('pets')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.pets[0]?.pet_id
* result[0]?.pets[0]?.name
* ```
*
* The generated SQL (MS SQL Server):
*
* ```sql
* select "id", (
* select coalesce((select * from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* order by "pet"."name"
* offset @1 rows
* ) as "agg" for json path, include_null_values), '[]')
* ) as "pets"
* from "person"
* ```
*/
function jsonArrayFrom(expr) {
return (0, sql_js_1.sql) `coalesce((select * from ${expr} as agg for json path, include_null_values), '[]')`;
}
/**
* An MS SQL Server helper for turning a subquery into a JSON object.
*
* The subquery must only return one row.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import { Kysely, MssqlDialect, ParseJSONResultsPlugin } from 'kysely'
* import * as Tarn from 'tarn'
* import * as Tedious from 'tedious'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new MssqlDialect({
* tarn: { options: { max: 10, min: 0 }, ...Tarn },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* authentication: {
* options: { password: 'password', userName: 'sa' },
* type: 'default',
* },
* options: { database: 'test', port: 21433, trustServerCertificate: true },
* server: 'localhost',
* }),
* },
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonObjectFrom } from 'kysely/helpers/mssql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonObjectFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .where('pet.is_favorite', '=', 1)
* ).as('favorite_pet')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.favorite_pet?.pet_id
* result[0]?.favorite_pet?.name
* ```
*
* The generated SQL (MS SQL Server):
*
* ```sql
* select "id", (
* select * from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* and "pet"."is_favorite" = @1
* ) as "agg" for json path, include_null_values, without_array_wrapper
* ) as "favorite_pet"
* from "person"
* ```
*/
function jsonObjectFrom(expr) {
return (0, sql_js_1.sql) `(select * from ${expr} as agg for json path, include_null_values, without_array_wrapper)`;
}
/**
* The MS SQL Server `json_query` function, single argument variant.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import { Kysely, MssqlDialect, ParseJSONResultsPlugin } from 'kysely'
* import * as Tarn from 'tarn'
* import * as Tedious from 'tedious'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new MssqlDialect({
* tarn: { options: { max: 10, min: 0 }, ...Tarn },
* tedious: {
* ...Tedious,
* connectionFactory: () => new Tedious.Connection({
* authentication: {
* options: { password: 'password', userName: 'sa' },
* type: 'default',
* },
* options: { database: 'test', port: 21433, trustServerCertificate: true },
* server: 'localhost',
* }),
* },
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonBuildObject } from 'kysely/helpers/mssql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonBuildObject({
* first: eb.ref('first_name'),
* last: eb.ref('last_name'),
* full: eb.fn('concat', ['first_name', eb.val(' '), 'last_name'])
* }).as('name')
* ])
* .execute()
* ```
*
* The generated SQL (MS SQL Server):
*
* ```sql
* select "id", json_query(
* '{"first":"'+"first_name"+',"last":"'+"last_name"+',"full":"'+concat("first_name", ' ', "last_name")+'"}'
* ) as "name"
* from "person"
* ```
*/
function jsonBuildObject(obj) {
return (0, sql_js_1.sql) `json_query('{${sql_js_1.sql.join(Object.keys(obj).map((k) => (0, sql_js_1.sql) `"${sql_js_1.sql.raw(k)}":"'+${obj[k]}+'"`), (0, sql_js_1.sql) `,`)}}')`;
}
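
The one-liner in `jsonBuildObject` above is dense; the string-assembly idea is that JSON punctuation is interleaved with SQL string concatenation (`+`) so `json_query` receives a JSON string built by the database. A hypothetical sketch of just that assembly step, where the values are raw SQL fragments rather than kysely expressions:

```typescript
// Hypothetical sketch of the string assembly behind jsonBuildObject (mssql).
// Input values are raw SQL fragments, not kysely Expression objects.
function jsonQuerySketch(obj: Record<string, string>): string {
  const body = Object.keys(obj)
    // Each key becomes `"key":"'+<sql>+'"` so the quotes close around the
    // value produced by SQL-side concatenation.
    .map((key) => `"${key}":"'+${obj[key]}+'"`)
    .join(',')
  return `json_query('{${body}}')`
}

console.log(jsonQuerySketch({ first: '"first_name"', last: '"last_name"' }))
// json_query('{"first":"'+"first_name"+'","last":"'+"last_name"+'"}')
```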

node_modules/kysely/dist/cjs/helpers/mysql.d.ts generated vendored Normal file

@@ -0,0 +1,147 @@
import type { Expression } from '../expression/expression.js';
import type { SelectQueryBuilderExpression } from '../query-builder/select-query-builder-expression.js';
import type { RawBuilder } from '../raw-builder/raw-builder.js';
import type { ShallowDehydrateObject, ShallowDehydrateValue, Simplify } from '../util/type-utils.js';
/**
* A MySQL helper for aggregating a subquery into a JSON array.
*
* NOTE: This helper is only guaranteed to fully work with the built-in `MysqlDialect`.
* While the produced SQL is compatible with all MySQL databases, some third-party dialects
* may not parse the nested JSON into arrays. In these cases you can use the built-in
* `ParseJSONResultsPlugin` to parse the results.
*
* ### Examples
*
* ```ts
* import { jsonArrayFrom } from 'kysely/helpers/mysql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonArrayFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .orderBy('pet.name')
* ).as('pets')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.pets[0]?.pet_id
* result[0]?.pets[0]?.name
* ```
*
* The generated SQL (MySQL):
*
* ```sql
* select `id`, (
* select cast(coalesce(json_arrayagg(json_object(
* 'pet_id', `agg`.`pet_id`,
* 'name', `agg`.`name`
* )), '[]') as json) from (
* select `pet`.`id` as `pet_id`, `pet`.`name`
* from `pet`
* where `pet`.`owner_id` = `person`.`id`
* order by `pet`.`name`
* ) as `agg`
* ) as `pets`
* from `person`
* ```
*/
export declare function jsonArrayFrom<O>(expr: SelectQueryBuilderExpression<O>): RawBuilder<Simplify<ShallowDehydrateObject<O>>[]>;
/**
* A MySQL helper for turning a subquery into a JSON object.
*
* The subquery must only return one row.
*
* NOTE: This helper is only guaranteed to fully work with the built-in `MysqlDialect`.
* While the produced SQL is compatible with all MySQL databases, some third-party dialects
* may not parse the nested JSON into objects. In these cases you can use the built-in
* `ParseJSONResultsPlugin` to parse the results.
*
* ### Examples
*
* ```ts
* import { jsonObjectFrom } from 'kysely/helpers/mysql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonObjectFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .where('pet.is_favorite', '=', true)
* ).as('favorite_pet')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.favorite_pet?.pet_id
* result[0]?.favorite_pet?.name
* ```
*
* The generated SQL (MySQL):
*
* ```sql
* select `id`, (
* select json_object(
* 'pet_id', `obj`.`pet_id`,
* 'name', `obj`.`name`
* ) from (
* select `pet`.`id` as `pet_id`, `pet`.`name`
* from `pet`
* where `pet`.`owner_id` = `person`.`id`
* and `pet`.`is_favorite` = ?
* ) as obj
* ) as `favorite_pet`
* from `person`
* ```
*/
export declare function jsonObjectFrom<O>(expr: SelectQueryBuilderExpression<O>): RawBuilder<Simplify<ShallowDehydrateObject<O>> | null>;
/**
* The MySQL `json_object` function.
*
* NOTE: This helper is only guaranteed to fully work with the built-in `MysqlDialect`.
* While the produced SQL is compatible with all MySQL databases, some third-party dialects
* may not parse the nested JSON into objects. In these cases you can use the built-in
* `ParseJSONResultsPlugin` to parse the results.
*
* ### Examples
*
* ```ts
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonBuildObject({
* first: eb.ref('first_name'),
* last: eb.ref('last_name'),
* full: eb.fn('concat', ['first_name', eb.val(' '), 'last_name'])
* }).as('name')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.name.first
* result[0]?.name.last
* result[0]?.name.full
* ```
*
* The generated SQL (MySQL):
*
* ```sql
* select "id", json_object(
* 'first', first_name,
* 'last', last_name,
* 'full', concat(`first_name`, ?, `last_name`)
* ) as "name"
* from "person"
* ```
*/
export declare function jsonBuildObject<O extends Record<string, Expression<unknown>>>(obj: O): RawBuilder<Simplify<{
[K in keyof O]: O[K] extends Expression<infer V> ? ShallowDehydrateValue<V> : never;
}>>;
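The `jsonBuildObject` declaration above interleaves each object key (as a SQL string literal) with its value expression and joins the pairs into a single `json_object(...)` call. A minimal sketch of that flattening in plain JavaScript — the name `buildJsonObjectSql` and the string-typed values are illustrative assumptions; the real implementation joins `RawBuilder` fragments via the `sql` tagged template, not strings:

```javascript
// Illustrative sketch only: kysely joins RawBuilder fragments, not strings.
// Each object key becomes a SQL string literal followed by its expression,
// mirroring json_object('first', first_name, 'last', last_name).
function buildJsonObjectSql(obj) {
  const args = Object.keys(obj).flatMap((key) => [`'${key}'`, obj[key]]);
  return `json_object(${args.join(', ')})`;
}

console.log(buildJsonObjectSql({ first: 'first_name', last: 'last_name' }));
// -> json_object('first', first_name, 'last', last_name)
```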

node_modules/kysely/dist/cjs/helpers/mysql.js generated vendored Normal file

@@ -0,0 +1,162 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.jsonArrayFrom = jsonArrayFrom;
exports.jsonObjectFrom = jsonObjectFrom;
exports.jsonBuildObject = jsonBuildObject;
const sql_js_1 = require("../raw-builder/sql.js");
const json_object_args_js_1 = require("../util/json-object-args.js");
/**
* A MySQL helper for aggregating a subquery into a JSON array.
*
* NOTE: This helper is only guaranteed to fully work with the built-in `MysqlDialect`.
* While the produced SQL is compatible with all MySQL databases, some third-party dialects
 * may not parse the nested JSON into arrays. In these cases you can use the built-in
* `ParseJSONResultsPlugin` to parse the results.
*
* ### Examples
*
* ```ts
* import { jsonArrayFrom } from 'kysely/helpers/mysql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonArrayFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .orderBy('pet.name')
* ).as('pets')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.pets[0]?.pet_id
* result[0]?.pets[0]?.name
* ```
*
* The generated SQL (MySQL):
*
* ```sql
* select `id`, (
* select cast(coalesce(json_arrayagg(json_object(
* 'pet_id', `agg`.`pet_id`,
* 'name', `agg`.`name`
* )), '[]') as json) from (
* select `pet`.`id` as `pet_id`, `pet`.`name`
* from `pet`
* where `pet`.`owner_id` = `person`.`id`
* order by `pet`.`name`
* ) as `agg`
* ) as `pets`
* from `person`
* ```
*/
function jsonArrayFrom(expr) {
return (0, sql_js_1.sql) `(select cast(coalesce(json_arrayagg(json_object(${sql_js_1.sql.join(getMysqlJsonObjectArgs(expr.toOperationNode(), 'agg'))})), '[]') as json) from ${expr} as agg)`;
}
/**
* A MySQL helper for turning a subquery into a JSON object.
*
* The subquery must only return one row.
*
* NOTE: This helper is only guaranteed to fully work with the built-in `MysqlDialect`.
* While the produced SQL is compatible with all MySQL databases, some third-party dialects
 * may not parse the nested JSON into objects. In these cases you can use the built-in
* `ParseJSONResultsPlugin` to parse the results.
*
* ### Examples
*
* ```ts
* import { jsonObjectFrom } from 'kysely/helpers/mysql'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonObjectFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .where('pet.is_favorite', '=', true)
* ).as('favorite_pet')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.favorite_pet?.pet_id
* result[0]?.favorite_pet?.name
* ```
*
* The generated SQL (MySQL):
*
* ```sql
* select `id`, (
* select json_object(
* 'pet_id', `obj`.`pet_id`,
* 'name', `obj`.`name`
* ) from (
* select `pet`.`id` as `pet_id`, `pet`.`name`
* from `pet`
* where `pet`.`owner_id` = `person`.`id`
* and `pet`.`is_favorite` = ?
* ) as obj
* ) as `favorite_pet`
* from `person`
* ```
*/
function jsonObjectFrom(expr) {
return (0, sql_js_1.sql) `(select json_object(${sql_js_1.sql.join(getMysqlJsonObjectArgs(expr.toOperationNode(), 'obj'))}) from ${expr} as obj)`;
}
/**
* The MySQL `json_object` function.
*
* NOTE: This helper is only guaranteed to fully work with the built-in `MysqlDialect`.
* While the produced SQL is compatible with all MySQL databases, some third-party dialects
 * may not parse the nested JSON into objects. In these cases you can use the built-in
* `ParseJSONResultsPlugin` to parse the results.
*
* ### Examples
*
* ```ts
 * import { jsonBuildObject } from 'kysely/helpers/mysql'
 *
 * const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonBuildObject({
* first: eb.ref('first_name'),
* last: eb.ref('last_name'),
* full: eb.fn('concat', ['first_name', eb.val(' '), 'last_name'])
* }).as('name')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.name.first
* result[0]?.name.last
* result[0]?.name.full
* ```
*
* The generated SQL (MySQL):
*
* ```sql
* select "id", json_object(
* 'first', first_name,
* 'last', last_name,
* 'full', concat(`first_name`, ?, `last_name`)
* ) as "name"
* from "person"
* ```
*/
function jsonBuildObject(obj) {
return (0, sql_js_1.sql) `json_object(${sql_js_1.sql.join(Object.keys(obj).flatMap((k) => [sql_js_1.sql.lit(k), obj[k]]))})`;
}
function getMysqlJsonObjectArgs(node, table) {
try {
return (0, json_object_args_js_1.getJsonObjectArgs)(node, table);
}
catch {
throw new Error('MySQL jsonArrayFrom and jsonObjectFrom functions can only handle explicit selections due to limitations of the json_object function. selectAll() is not allowed in the subquery.');
}
}
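The error above exists because MySQL's `json_object` needs an explicit `'key', value` pair for every column, so the helper must enumerate the selections at build time; `selectAll()` leaves the column set unknown. The pairing that `getJsonObjectArgs` performs can be sketched roughly as follows — the name `pairKeysWithRefs` and the string-based alias parsing are illustrative assumptions; the real code walks operation nodes, not strings:

```javascript
// Illustrative sketch: derive each selection's output alias and emit the
// interleaved ('alias', `table`.`alias`) argument list for json_object.
function pairKeysWithRefs(table, selections) {
  return selections.flatMap((selection) => {
    // 'pet.id as pet_id' -> alias 'pet_id'; 'pet.name' -> alias 'name'
    const alias = selection.split(' as ').pop().split('.').pop();
    return [`'${alias}'`, `\`${table}\`.\`${alias}\``];
  });
}

console.log(pairKeysWithRefs('agg', ['pet.id as pet_id', 'pet.name']).join(', '));
// -> 'pet_id', `agg`.`pet_id`, 'name', `agg`.`name`
```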

node_modules/kysely/dist/cjs/helpers/postgres.d.ts generated vendored Normal file

@@ -0,0 +1,199 @@
import type { Expression } from '../expression/expression.js';
import type { RawBuilder } from '../raw-builder/raw-builder.js';
import type { ShallowDehydrateValue, ShallowDehydrateObject, Simplify } from '../util/type-utils.js';
/**
* A postgres helper for aggregating a subquery (or other expression) into a JSONB array.
*
* ### Examples
*
* <!-- siteExample("select", "Nested array", 110) -->
*
 * While kysely is not an ORM and doesn't have the concept of relations, we do provide
 * helpers for fetching nested objects and arrays in a single query. In this example we
 * use the `jsonArrayFrom` helper to fetch the person's pets along with the person's id.
*
 * Please keep in mind that the helpers under the `kysely/helpers` folder, including
 * `jsonArrayFrom`, are not guaranteed to work with third-party dialects. In order for
 * them to work, the dialect must automatically parse the `json` data type into
 * JavaScript JSON values like objects and arrays. Some dialects might simply return
 * the data as a JSON string. In these cases you can use the built-in `ParseJSONResultsPlugin`
 * to parse the results.
*
* ```ts
* import { jsonArrayFrom } from 'kysely/helpers/postgres'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonArrayFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .orderBy('pet.name')
* ).as('pets')
* ])
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "id", (
* select coalesce(json_agg(agg), '[]') from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* order by "pet"."name"
* ) as agg
* ) as "pets"
* from "person"
* ```
*/
export declare function jsonArrayFrom<O>(expr: Expression<O>): RawBuilder<Simplify<ShallowDehydrateObject<O>>[]>;
/**
* A postgres helper for turning a subquery (or other expression) into a JSON object.
*
* The subquery must only return one row.
*
* ### Examples
*
* <!-- siteExample("select", "Nested object", 120) -->
*
 * While kysely is not an ORM and doesn't have the concept of relations, we do provide
 * helpers for fetching nested objects and arrays in a single query. In this example we
 * use the `jsonObjectFrom` helper to fetch the person's favorite pet along with the person's id.
*
* Please keep in mind that the helpers under the `kysely/helpers` folder, including
* `jsonObjectFrom`, are not guaranteed to work with third-party dialects. In order for
* them to work, the dialect must automatically parse the `json` data type into
* JavaScript JSON values like objects and arrays. Some dialects might simply return
 * the data as a JSON string. In these cases you can use the built-in `ParseJSONResultsPlugin`
* to parse the results.
*
* ```ts
* import { jsonObjectFrom } from 'kysely/helpers/postgres'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonObjectFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .where('pet.is_favorite', '=', true)
* ).as('favorite_pet')
* ])
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "id", (
* select to_json(obj) from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* and "pet"."is_favorite" = $1
* ) as obj
* ) as "favorite_pet"
* from "person"
* ```
*/
export declare function jsonObjectFrom<O>(expr: Expression<O>): RawBuilder<Simplify<ShallowDehydrateObject<O>> | null>;
/**
* The PostgreSQL `json_build_object` function.
*
* NOTE: This helper is only guaranteed to fully work with the built-in `PostgresDialect`.
* While the produced SQL is compatible with all PostgreSQL databases, some third-party dialects
 * may not parse the nested JSON into objects. In these cases you can use the built-in
* `ParseJSONResultsPlugin` to parse the results.
*
* ### Examples
*
* ```ts
* import { sql } from 'kysely'
* import { jsonBuildObject } from 'kysely/helpers/postgres'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonBuildObject({
* first: eb.ref('first_name'),
* last: eb.ref('last_name'),
* full: sql<string>`first_name || ' ' || last_name`
* }).as('name')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.name.first
* result[0]?.name.last
* result[0]?.name.full
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "id", json_build_object(
* 'first', first_name,
* 'last', last_name,
* 'full', first_name || ' ' || last_name
* ) as "name"
* from "person"
* ```
*/
export declare function jsonBuildObject<O extends Record<string, Expression<unknown>>>(obj: O): RawBuilder<Simplify<{
[K in keyof O]: O[K] extends Expression<infer V> ? ShallowDehydrateValue<V> : never;
}>>;
export type MergeAction = 'INSERT' | 'UPDATE' | 'DELETE';
/**
* The PostgreSQL `merge_action` function.
*
* This function can be used in a `returning` clause to get the action that was
* performed in a `mergeInto` query. The function returns one of the following
* strings: `'INSERT'`, `'UPDATE'`, or `'DELETE'`.
*
* ### Examples
*
* ```ts
* import { mergeAction } from 'kysely/helpers/postgres'
*
* const result = await db
* .mergeInto('person as p')
* .using('person_backup as pb', 'p.id', 'pb.id')
* .whenMatched()
* .thenUpdateSet(({ ref }) => ({
* first_name: ref('pb.first_name'),
* updated_at: ref('pb.updated_at').$castTo<string | null>(),
* }))
* .whenNotMatched()
 * .thenInsertValues(({ ref }) => ({
* id: ref('pb.id'),
* first_name: ref('pb.first_name'),
* created_at: ref('pb.updated_at'),
* updated_at: ref('pb.updated_at').$castTo<string | null>(),
* }))
* .returning([mergeAction().as('action'), 'p.id', 'p.updated_at'])
* .execute()
*
* result[0].action
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* merge into "person" as "p"
* using "person_backup" as "pb" on "p"."id" = "pb"."id"
* when matched then update set
* "first_name" = "pb"."first_name",
* "updated_at" = "pb"."updated_at"::text
 * when not matched then insert ("id", "first_name", "created_at", "updated_at")
* values ("pb"."id", "pb"."first_name", "pb"."updated_at", "pb"."updated_at")
* returning merge_action() as "action", "p"."id", "p"."updated_at"
* ```
*/
export declare function mergeAction(): RawBuilder<MergeAction>;

node_modules/kysely/dist/cjs/helpers/postgres.js generated vendored Normal file

@@ -0,0 +1,208 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.jsonArrayFrom = jsonArrayFrom;
exports.jsonObjectFrom = jsonObjectFrom;
exports.jsonBuildObject = jsonBuildObject;
exports.mergeAction = mergeAction;
const sql_js_1 = require("../raw-builder/sql.js");
/**
* A postgres helper for aggregating a subquery (or other expression) into a JSONB array.
*
* ### Examples
*
* <!-- siteExample("select", "Nested array", 110) -->
*
 * While kysely is not an ORM and doesn't have the concept of relations, we do provide
 * helpers for fetching nested objects and arrays in a single query. In this example we
 * use the `jsonArrayFrom` helper to fetch the person's pets along with the person's id.
*
 * Please keep in mind that the helpers under the `kysely/helpers` folder, including
 * `jsonArrayFrom`, are not guaranteed to work with third-party dialects. In order for
 * them to work, the dialect must automatically parse the `json` data type into
 * JavaScript JSON values like objects and arrays. Some dialects might simply return
 * the data as a JSON string. In these cases you can use the built-in `ParseJSONResultsPlugin`
 * to parse the results.
*
* ```ts
* import { jsonArrayFrom } from 'kysely/helpers/postgres'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonArrayFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .orderBy('pet.name')
* ).as('pets')
* ])
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "id", (
* select coalesce(json_agg(agg), '[]') from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* order by "pet"."name"
* ) as agg
* ) as "pets"
* from "person"
* ```
*/
function jsonArrayFrom(expr) {
return (0, sql_js_1.sql) `(select coalesce(json_agg(agg), '[]') from ${expr} as agg)`;
}
/**
* A postgres helper for turning a subquery (or other expression) into a JSON object.
*
* The subquery must only return one row.
*
* ### Examples
*
* <!-- siteExample("select", "Nested object", 120) -->
*
 * While kysely is not an ORM and doesn't have the concept of relations, we do provide
 * helpers for fetching nested objects and arrays in a single query. In this example we
 * use the `jsonObjectFrom` helper to fetch the person's favorite pet along with the person's id.
*
* Please keep in mind that the helpers under the `kysely/helpers` folder, including
* `jsonObjectFrom`, are not guaranteed to work with third-party dialects. In order for
* them to work, the dialect must automatically parse the `json` data type into
* JavaScript JSON values like objects and arrays. Some dialects might simply return
 * the data as a JSON string. In these cases you can use the built-in `ParseJSONResultsPlugin`
* to parse the results.
*
* ```ts
* import { jsonObjectFrom } from 'kysely/helpers/postgres'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonObjectFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .where('pet.is_favorite', '=', true)
* ).as('favorite_pet')
* ])
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "id", (
* select to_json(obj) from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* and "pet"."is_favorite" = $1
* ) as obj
* ) as "favorite_pet"
* from "person"
* ```
*/
function jsonObjectFrom(expr) {
return (0, sql_js_1.sql) `(select to_json(obj) from ${expr} as obj)`;
}
/**
* The PostgreSQL `json_build_object` function.
*
* NOTE: This helper is only guaranteed to fully work with the built-in `PostgresDialect`.
* While the produced SQL is compatible with all PostgreSQL databases, some third-party dialects
 * may not parse the nested JSON into objects. In these cases you can use the built-in
* `ParseJSONResultsPlugin` to parse the results.
*
* ### Examples
*
* ```ts
* import { sql } from 'kysely'
* import { jsonBuildObject } from 'kysely/helpers/postgres'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonBuildObject({
* first: eb.ref('first_name'),
* last: eb.ref('last_name'),
* full: sql<string>`first_name || ' ' || last_name`
* }).as('name')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.name.first
* result[0]?.name.last
* result[0]?.name.full
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "id", json_build_object(
* 'first', first_name,
* 'last', last_name,
* 'full', first_name || ' ' || last_name
* ) as "name"
* from "person"
* ```
*/
function jsonBuildObject(obj) {
return (0, sql_js_1.sql) `json_build_object(${sql_js_1.sql.join(Object.keys(obj).flatMap((k) => [sql_js_1.sql.lit(k), obj[k]]))})`;
}
/**
* The PostgreSQL `merge_action` function.
*
* This function can be used in a `returning` clause to get the action that was
* performed in a `mergeInto` query. The function returns one of the following
* strings: `'INSERT'`, `'UPDATE'`, or `'DELETE'`.
*
* ### Examples
*
* ```ts
* import { mergeAction } from 'kysely/helpers/postgres'
*
* const result = await db
* .mergeInto('person as p')
* .using('person_backup as pb', 'p.id', 'pb.id')
* .whenMatched()
* .thenUpdateSet(({ ref }) => ({
* first_name: ref('pb.first_name'),
* updated_at: ref('pb.updated_at').$castTo<string | null>(),
* }))
* .whenNotMatched()
 * .thenInsertValues(({ ref }) => ({
* id: ref('pb.id'),
* first_name: ref('pb.first_name'),
* created_at: ref('pb.updated_at'),
* updated_at: ref('pb.updated_at').$castTo<string | null>(),
* }))
* .returning([mergeAction().as('action'), 'p.id', 'p.updated_at'])
* .execute()
*
* result[0].action
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* merge into "person" as "p"
* using "person_backup" as "pb" on "p"."id" = "pb"."id"
* when matched then update set
* "first_name" = "pb"."first_name",
* "updated_at" = "pb"."updated_at"::text
 * when not matched then insert ("id", "first_name", "created_at", "updated_at")
* values ("pb"."id", "pb"."first_name", "pb"."updated_at", "pb"."updated_at")
* returning merge_action() as "action", "p"."id", "p"."updated_at"
* ```
*/
function mergeAction() {
return (0, sql_js_1.sql) `merge_action()`;
}
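A contrast worth noting between the implementations above and the MySQL/SQLite variants: `to_json` and `json_agg` operate on whole row values, so the postgres helpers can wrap any subquery without enumerating its columns. A rough sketch of the wrapping, with plain string templates standing in for the real `sql` tagged-template fragments:

```javascript
// Illustrative only: the real helpers interpolate a compiled subquery via the
// sql tagged template; plain strings are used here to show the SQL shape.
const wrapAsJsonArray = (subquery) =>
  `(select coalesce(json_agg(agg), '[]') from ${subquery} as agg)`;
const wrapAsJsonObject = (subquery) =>
  `(select to_json(obj) from ${subquery} as obj)`;

const pets = '(select "pet"."id", "pet"."name" from "pet")';
console.log(wrapAsJsonArray(pets));
// -> (select coalesce(json_agg(agg), '[]') from (select "pet"."id", "pet"."name" from "pet") as agg)
```

The `coalesce(..., '[]')` wrapper is what guarantees an empty array, rather than SQL `NULL`, for parent rows with no matching children.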

node_modules/kysely/dist/cjs/helpers/sqlite.d.ts generated vendored Normal file

@@ -0,0 +1,189 @@
import type { Expression } from '../expression/expression.js';
import type { SelectQueryBuilderExpression } from '../query-builder/select-query-builder-expression.js';
import type { RawBuilder } from '../raw-builder/raw-builder.js';
import type { ShallowDehydrateObject, ShallowDehydrateValue, Simplify } from '../util/type-utils.js';
/**
* A SQLite helper for aggregating a subquery into a JSON array.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { Kysely, ParseJSONResultsPlugin, SqliteDialect } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:')
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonArrayFrom } from 'kysely/helpers/sqlite'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonArrayFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .orderBy('pet.name')
* ).as('pets')
* ])
* .execute()
*
* result[0]?.id
 * result[0]?.pets[0]?.pet_id
 * result[0]?.pets[0]?.name
* ```
*
* The generated SQL (SQLite):
*
* ```sql
* select "id", (
* select coalesce(json_group_array(json_object(
* 'pet_id', "agg"."pet_id",
* 'name', "agg"."name"
* )), '[]') from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* order by "pet"."name"
* ) as "agg"
* ) as "pets"
* from "person"
* ```
*/
export declare function jsonArrayFrom<O>(expr: SelectQueryBuilderExpression<O>): RawBuilder<Simplify<ShallowDehydrateObject<O>>[]>;
/**
* A SQLite helper for turning a subquery into a JSON object.
*
* The subquery must only return one row.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { Kysely, ParseJSONResultsPlugin, SqliteDialect } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:')
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonObjectFrom } from 'kysely/helpers/sqlite'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonObjectFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .where('pet.is_favorite', '=', true)
* ).as('favorite_pet')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.favorite_pet?.pet_id
* result[0]?.favorite_pet?.name
* ```
*
* The generated SQL (SQLite):
*
* ```sql
* select "id", (
* select json_object(
* 'pet_id', "obj"."pet_id",
* 'name', "obj"."name"
* ) from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* and "pet"."is_favorite" = ?
* ) as obj
* ) as "favorite_pet"
* from "person";
* ```
*/
export declare function jsonObjectFrom<O>(expr: SelectQueryBuilderExpression<O>): RawBuilder<Simplify<ShallowDehydrateObject<O>> | null>;
/**
* The SQLite `json_object` function.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { Kysely, ParseJSONResultsPlugin, SqliteDialect } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:')
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { sql } from 'kysely'
* import { jsonBuildObject } from 'kysely/helpers/sqlite'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonBuildObject({
* first: eb.ref('first_name'),
* last: eb.ref('last_name'),
* full: sql<string>`first_name || ' ' || last_name`
* }).as('name')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.name.first
* result[0]?.name.last
* result[0]?.name.full
* ```
*
* The generated SQL (SQLite):
*
* ```sql
* select "id", json_object(
* 'first', first_name,
* 'last', last_name,
* 'full', "first_name" || ' ' || "last_name"
* ) as "name"
* from "person"
* ```
*/
export declare function jsonBuildObject<O extends Record<string, Expression<unknown>>>(obj: O): RawBuilder<Simplify<{
[K in keyof O]: O[K] extends Expression<infer V> ? ShallowDehydrateValue<V> : never;
}>>;

node_modules/kysely/dist/cjs/helpers/sqlite.js generated vendored Normal file

@@ -0,0 +1,204 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.jsonArrayFrom = jsonArrayFrom;
exports.jsonObjectFrom = jsonObjectFrom;
exports.jsonBuildObject = jsonBuildObject;
const sql_js_1 = require("../raw-builder/sql.js");
const json_object_args_js_1 = require("../util/json-object-args.js");
/**
* A SQLite helper for aggregating a subquery into a JSON array.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { Kysely, ParseJSONResultsPlugin, SqliteDialect } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:')
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonArrayFrom } from 'kysely/helpers/sqlite'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonArrayFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .orderBy('pet.name')
* ).as('pets')
* ])
* .execute()
*
* result[0]?.id
 * result[0]?.pets[0]?.pet_id
 * result[0]?.pets[0]?.name
* ```
*
* The generated SQL (SQLite):
*
* ```sql
* select "id", (
* select coalesce(json_group_array(json_object(
* 'pet_id', "agg"."pet_id",
* 'name', "agg"."name"
* )), '[]') from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* order by "pet"."name"
* ) as "agg"
* ) as "pets"
* from "person"
* ```
*/
function jsonArrayFrom(expr) {
return (0, sql_js_1.sql) `(select coalesce(json_group_array(json_object(${sql_js_1.sql.join(getSqliteJsonObjectArgs(expr.toOperationNode(), 'agg'))})), '[]') from ${expr} as agg)`;
}
/**
* A SQLite helper for turning a subquery into a JSON object.
*
* The subquery must only return one row.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { Kysely, ParseJSONResultsPlugin, SqliteDialect } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:')
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { jsonObjectFrom } from 'kysely/helpers/sqlite'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonObjectFrom(
* eb.selectFrom('pet')
* .select(['pet.id as pet_id', 'pet.name'])
* .whereRef('pet.owner_id', '=', 'person.id')
* .where('pet.is_favorite', '=', true)
* ).as('favorite_pet')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.favorite_pet?.pet_id
* result[0]?.favorite_pet?.name
* ```
*
* The generated SQL (SQLite):
*
* ```sql
* select "id", (
* select json_object(
* 'pet_id', "obj"."pet_id",
* 'name', "obj"."name"
* ) from (
* select "pet"."id" as "pet_id", "pet"."name"
* from "pet"
* where "pet"."owner_id" = "person"."id"
* and "pet"."is_favorite" = ?
* ) as obj
* ) as "favorite_pet"
* from "person";
* ```
*/
function jsonObjectFrom(expr) {
return (0, sql_js_1.sql) `(select json_object(${sql_js_1.sql.join(getSqliteJsonObjectArgs(expr.toOperationNode(), 'obj'))}) from ${expr} as obj)`;
}
/**
* The SQLite `json_object` function.
*
* NOTE: This helper only works correctly if you've installed the `ParseJSONResultsPlugin`.
* Otherwise the nested selections will be returned as JSON strings.
*
* The plugin can be installed like this:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { Kysely, ParseJSONResultsPlugin, SqliteDialect } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:')
* }),
* plugins: [new ParseJSONResultsPlugin()]
* })
* ```
*
* ### Examples
*
* ```ts
* import { sql } from 'kysely'
* import { jsonBuildObject } from 'kysely/helpers/sqlite'
*
* const result = await db
* .selectFrom('person')
* .select((eb) => [
* 'id',
* jsonBuildObject({
* first: eb.ref('first_name'),
* last: eb.ref('last_name'),
* full: sql<string>`first_name || ' ' || last_name`
* }).as('name')
* ])
* .execute()
*
* result[0]?.id
* result[0]?.name.first
* result[0]?.name.last
* result[0]?.name.full
* ```
*
* The generated SQL (SQLite):
*
* ```sql
* select "id", json_object(
* 'first', first_name,
* 'last', last_name,
* 'full', "first_name" || ' ' || "last_name"
* ) as "name"
* from "person"
* ```
*/
function jsonBuildObject(obj) {
return (0, sql_js_1.sql) `json_object(${sql_js_1.sql.join(Object.keys(obj).flatMap((k) => [sql_js_1.sql.lit(k), obj[k]]))})`;
}
function getSqliteJsonObjectArgs(node, table) {
try {
return (0, json_object_args_js_1.getJsonObjectArgs)(node, table);
}
catch {
throw new Error('SQLite jsonArrayFrom and jsonObjectFrom functions can only handle explicit selections due to limitations of the json_object function. selectAll() is not allowed in the subquery.');
}
}
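Since SQLite drivers hand nested JSON back as plain strings, the `ParseJSONResultsPlugin` mentioned throughout this file is what turns the `json_object`/`json_group_array` output into real objects and arrays. Its effect can be sketched roughly as follows — `parseJsonColumns` is a hypothetical helper, not the plugin's actual implementation, which also recurses into nested values:

```javascript
// Illustrative sketch: for each row, parse string values that look like JSON
// arrays or objects, leaving everything else untouched.
function parseJsonColumns(row) {
  return Object.fromEntries(
    Object.entries(row).map(([key, value]) => {
      if (typeof value === 'string' && /^[\[{]/.test(value)) {
        try {
          return [key, JSON.parse(value)];
        } catch {
          // Not valid JSON after all: keep the original string.
        }
      }
      return [key, value];
    }),
  );
}

const row = parseJsonColumns({ id: 1, pets: '[{"pet_id":2,"name":"Catto"}]' });
console.log(row.pets[0].name); // -> Catto
```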

node_modules/kysely/dist/cjs/index.d.ts generated vendored Normal file

@@ -0,0 +1,226 @@
export * from './kysely.js';
export * from './query-creator.js';
export * from './expression/expression.js';
export { type ExpressionBuilder, expressionBuilder, } from './expression/expression-builder.js';
export * from './expression/expression-wrapper.js';
export * from './query-builder/where-interface.js';
export * from './query-builder/returning-interface.js';
export * from './query-builder/output-interface.js';
export * from './query-builder/having-interface.js';
export * from './query-builder/order-by-interface.js';
export * from './query-builder/select-query-builder.js';
export * from './query-builder/insert-query-builder.js';
export * from './query-builder/update-query-builder.js';
export * from './query-builder/delete-query-builder.js';
export * from './query-builder/no-result-error.js';
export * from './query-builder/join-builder.js';
export * from './query-builder/function-module.js';
export * from './query-builder/insert-result.js';
export * from './query-builder/delete-result.js';
export * from './query-builder/update-result.js';
export * from './query-builder/on-conflict-builder.js';
export * from './query-builder/aggregate-function-builder.js';
export * from './query-builder/case-builder.js';
export * from './query-builder/json-path-builder.js';
export * from './query-builder/merge-query-builder.js';
export * from './query-builder/merge-result.js';
export * from './query-builder/order-by-item-builder.js';
export * from './raw-builder/raw-builder.js';
export * from './raw-builder/sql.js';
export * from './query-executor/query-executor.js';
export * from './query-executor/default-query-executor.js';
export * from './query-executor/noop-query-executor.js';
export * from './query-executor/query-executor-provider.js';
export * from './query-compiler/default-query-compiler.js';
export * from './query-compiler/compiled-query.js';
export * from './schema/schema.js';
export * from './schema/create-table-builder.js';
export * from './schema/create-type-builder.js';
export * from './schema/drop-table-builder.js';
export * from './schema/drop-type-builder.js';
export * from './schema/create-index-builder.js';
export * from './schema/drop-index-builder.js';
export * from './schema/create-schema-builder.js';
export * from './schema/drop-schema-builder.js';
export * from './schema/column-definition-builder.js';
export * from './schema/foreign-key-constraint-builder.js';
export * from './schema/alter-table-builder.js';
export * from './schema/create-view-builder.js';
export * from './schema/refresh-materialized-view-builder.js';
export * from './schema/drop-view-builder.js';
export * from './schema/alter-column-builder.js';
export * from './dynamic/dynamic.js';
export * from './dynamic/dynamic-reference-builder.js';
export * from './dynamic/dynamic-table-builder.js';
export * from './driver/driver.js';
export * from './driver/database-connection.js';
export * from './driver/connection-provider.js';
export * from './driver/default-connection-provider.js';
export * from './driver/single-connection-provider.js';
export * from './driver/dummy-driver.js';
export * from './dialect/dialect.js';
export * from './dialect/dialect-adapter.js';
export * from './dialect/dialect-adapter-base.js';
export * from './dialect/database-introspector.js';
export * from './dialect/sqlite/sqlite-dialect.js';
export * from './dialect/sqlite/sqlite-dialect-config.js';
export * from './dialect/sqlite/sqlite-driver.js';
export * from './dialect/postgres/postgres-query-compiler.js';
export * from './dialect/postgres/postgres-introspector.js';
export * from './dialect/postgres/postgres-adapter.js';
export * from './dialect/mysql/mysql-dialect.js';
export * from './dialect/mysql/mysql-dialect-config.js';
export * from './dialect/mysql/mysql-driver.js';
export * from './dialect/mysql/mysql-query-compiler.js';
export * from './dialect/mysql/mysql-introspector.js';
export * from './dialect/mysql/mysql-adapter.js';
export * from './dialect/postgres/postgres-driver.js';
export * from './dialect/postgres/postgres-dialect-config.js';
export * from './dialect/postgres/postgres-dialect.js';
export * from './dialect/sqlite/sqlite-query-compiler.js';
export * from './dialect/sqlite/sqlite-introspector.js';
export * from './dialect/sqlite/sqlite-adapter.js';
export * from './dialect/mssql/mssql-adapter.js';
export * from './dialect/mssql/mssql-dialect-config.js';
export * from './dialect/mssql/mssql-dialect.js';
export * from './dialect/mssql/mssql-driver.js';
export * from './dialect/mssql/mssql-introspector.js';
export * from './dialect/mssql/mssql-query-compiler.js';
export * from './query-compiler/query-compiler.js';
export * from './migration/migrator.js';
export * from './migration/file-migration-provider.js';
export * from './plugin/kysely-plugin.js';
export * from './plugin/camel-case/camel-case-plugin.js';
export * from './plugin/deduplicate-joins/deduplicate-joins-plugin.js';
export * from './plugin/with-schema/with-schema-plugin.js';
export * from './plugin/parse-json-results/parse-json-results-plugin.js';
export * from './plugin/handle-empty-in-lists/handle-empty-in-lists-plugin.js';
export * from './plugin/handle-empty-in-lists/handle-empty-in-lists.js';
export * from './operation-node/add-column-node.js';
export * from './operation-node/add-constraint-node.js';
export * from './operation-node/add-index-node.js';
export * from './operation-node/aggregate-function-node.js';
export * from './operation-node/alias-node.js';
export * from './operation-node/alter-column-node.js';
export * from './operation-node/alter-table-node.js';
export * from './operation-node/and-node.js';
export * from './operation-node/binary-operation-node.js';
export * from './operation-node/case-node.js';
export * from './operation-node/cast-node.js';
export * from './operation-node/check-constraint-node.js';
export * from './operation-node/collate-node.js';
export * from './operation-node/column-definition-node.js';
export * from './operation-node/column-node.js';
export * from './operation-node/column-update-node.js';
export * from './operation-node/common-table-expression-name-node.js';
export * from './operation-node/common-table-expression-node.js';
export * from './operation-node/constraint-node.js';
export * from './operation-node/create-index-node.js';
export * from './operation-node/create-schema-node.js';
export * from './operation-node/create-table-node.js';
export * from './operation-node/create-type-node.js';
export * from './operation-node/create-view-node.js';
export * from './operation-node/refresh-materialized-view-node.js';
export * from './operation-node/data-type-node.js';
export * from './operation-node/default-insert-value-node.js';
export * from './operation-node/default-value-node.js';
export * from './operation-node/delete-query-node.js';
export * from './operation-node/drop-column-node.js';
export * from './operation-node/drop-constraint-node.js';
export * from './operation-node/drop-index-node.js';
export * from './operation-node/drop-schema-node.js';
export * from './operation-node/drop-table-node.js';
export * from './operation-node/drop-type-node.js';
export * from './operation-node/drop-view-node.js';
export * from './operation-node/explain-node.js';
export * from './operation-node/fetch-node.js';
export * from './operation-node/foreign-key-constraint-node.js';
export * from './operation-node/from-node.js';
export * from './operation-node/function-node.js';
export * from './operation-node/generated-node.js';
export * from './operation-node/group-by-item-node.js';
export * from './operation-node/group-by-node.js';
export * from './operation-node/having-node.js';
export * from './operation-node/identifier-node.js';
export * from './operation-node/insert-query-node.js';
export * from './operation-node/join-node.js';
export * from './operation-node/json-operator-chain-node.js';
export * from './operation-node/json-path-leg-node.js';
export * from './operation-node/json-path-node.js';
export * from './operation-node/json-reference-node.js';
export * from './operation-node/limit-node.js';
export * from './operation-node/list-node.js';
export * from './operation-node/matched-node.js';
export * from './operation-node/merge-query-node.js';
export * from './operation-node/modify-column-node.js';
export * from './operation-node/offset-node.js';
export * from './operation-node/on-conflict-node.js';
export * from './operation-node/on-duplicate-key-node.js';
export * from './operation-node/on-node.js';
export * from './operation-node/operation-node-source.js';
export * from './operation-node/operation-node-transformer.js';
export * from './operation-node/operation-node-visitor.js';
export * from './operation-node/operation-node.js';
export * from './operation-node/operator-node.js';
export * from './operation-node/or-action-node.js';
export * from './operation-node/or-node.js';
export * from './operation-node/order-by-item-node.js';
export * from './operation-node/order-by-node.js';
export * from './operation-node/output-node.js';
export * from './operation-node/over-node.js';
export * from './operation-node/parens-node.js';
export * from './operation-node/partition-by-item-node.js';
export * from './operation-node/partition-by-node.js';
export * from './operation-node/primary-key-constraint-node.js';
export * from './operation-node/primitive-value-list-node.js';
export * from './operation-node/query-node.js';
export * from './operation-node/raw-node.js';
export * from './operation-node/reference-node.js';
export * from './operation-node/references-node.js';
export * from './operation-node/rename-column-node.js';
export * from './operation-node/rename-constraint-node.js';
export * from './operation-node/returning-node.js';
export * from './operation-node/schemable-identifier-node.js';
export * from './operation-node/select-all-node.js';
export * from './operation-node/select-modifier-node.js';
export * from './operation-node/select-query-node.js';
export * from './operation-node/selection-node.js';
export * from './operation-node/set-operation-node.js';
export * from './operation-node/simple-reference-expression-node.js';
export * from './operation-node/table-node.js';
export * from './operation-node/top-node.js';
export * from './operation-node/tuple-node.js';
export * from './operation-node/unary-operation-node.js';
export * from './operation-node/unique-constraint-node.js';
export * from './operation-node/update-query-node.js';
export * from './operation-node/using-node.js';
export * from './operation-node/value-list-node.js';
export * from './operation-node/value-node.js';
export * from './operation-node/values-node.js';
export * from './operation-node/when-node.js';
export * from './operation-node/where-node.js';
export * from './operation-node/with-node.js';
export * from './util/column-type.js';
export * from './util/compilable.js';
export * from './util/explainable.js';
export * from './util/streamable.js';
export * from './util/log.js';
export type { AnyAliasedColumn, AnyAliasedColumnWithTable, AnyColumn, AnyColumnWithTable, DrainOuterGeneric, Equals, ExtractColumnType, UnknownRow, Simplify, SqlBool, Nullable, NumbersWhenDataTypeNotAvailable, NotNull, NumericString, ShallowDehydrateObject, ShallowDehydrateValue, SimplifyResult, SimplifySingleResult, StringsWhenDataTypeNotAvailable, } from './util/type-utils.js';
export * from './util/infer-result.js';
export { logOnce } from './util/log-once.js';
export { createQueryId, type QueryId } from './util/query-id.js';
export type { KyselyTypeError } from './util/type-error.js';
export type { SelectExpression, SelectCallback, SelectArg, Selection, CallbackSelection, } from './parser/select-parser.js';
export type { ReferenceExpression, ReferenceExpressionOrList, SimpleReferenceExpression, StringReference, ExtractTypeFromStringReference, ExtractTypeFromReferenceExpression, } from './parser/reference-parser.js';
export type { ValueExpression, ValueExpressionOrList, } from './parser/value-parser.js';
export type { SimpleTableReference, TableExpression, TableExpressionOrList, } from './parser/table-parser.js';
export type { JoinReferenceExpression, JoinCallbackExpression, } from './parser/join-parser.js';
export type { InsertObject } from './parser/insert-values-parser.js';
export type { UpdateObject } from './parser/update-set-parser.js';
export type { OrderByExpression, OrderByDirectionExpression, OrderByModifiers, OrderByDirection, OrderByModifiersCallbackExpression, } from './parser/order-by-parser.js';
export type { ComparisonOperatorExpression, OperandValueExpression, OperandValueExpressionOrList, FilterObject, } from './parser/binary-operation-parser.js';
export type { ExistsExpression } from './parser/unary-operation-parser.js';
export type { OperandExpression, ExpressionOrFactory, } from './parser/expression-parser.js';
export type { Collation } from './parser/collate-parser.js';
export type { QueryCreatorWithCommonTableExpression } from './parser/with-parser.js';

231
node_modules/kysely/dist/cjs/index.js generated vendored Normal file

@@ -0,0 +1,231 @@
"use strict";
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
var desc = Object.getOwnPropertyDescriptor(m, k);
if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
desc = { enumerable: true, get: function() { return m[k]; } };
}
Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
if (k2 === undefined) k2 = k;
o[k2] = m[k];
}));
var __exportStar = (this && this.__exportStar) || function(m, exports) {
for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.createQueryId = exports.logOnce = exports.expressionBuilder = void 0;
__exportStar(require("./kysely.js"), exports);
__exportStar(require("./query-creator.js"), exports);
__exportStar(require("./expression/expression.js"), exports);
var expression_builder_js_1 = require("./expression/expression-builder.js");
Object.defineProperty(exports, "expressionBuilder", { enumerable: true, get: function () { return expression_builder_js_1.expressionBuilder; } });
__exportStar(require("./expression/expression-wrapper.js"), exports);
__exportStar(require("./query-builder/where-interface.js"), exports);
__exportStar(require("./query-builder/returning-interface.js"), exports);
__exportStar(require("./query-builder/output-interface.js"), exports);
__exportStar(require("./query-builder/having-interface.js"), exports);
__exportStar(require("./query-builder/order-by-interface.js"), exports);
__exportStar(require("./query-builder/select-query-builder.js"), exports);
__exportStar(require("./query-builder/insert-query-builder.js"), exports);
__exportStar(require("./query-builder/update-query-builder.js"), exports);
__exportStar(require("./query-builder/delete-query-builder.js"), exports);
__exportStar(require("./query-builder/no-result-error.js"), exports);
__exportStar(require("./query-builder/join-builder.js"), exports);
__exportStar(require("./query-builder/function-module.js"), exports);
__exportStar(require("./query-builder/insert-result.js"), exports);
__exportStar(require("./query-builder/delete-result.js"), exports);
__exportStar(require("./query-builder/update-result.js"), exports);
__exportStar(require("./query-builder/on-conflict-builder.js"), exports);
__exportStar(require("./query-builder/aggregate-function-builder.js"), exports);
__exportStar(require("./query-builder/case-builder.js"), exports);
__exportStar(require("./query-builder/json-path-builder.js"), exports);
__exportStar(require("./query-builder/merge-query-builder.js"), exports);
__exportStar(require("./query-builder/merge-result.js"), exports);
__exportStar(require("./query-builder/order-by-item-builder.js"), exports);
__exportStar(require("./raw-builder/raw-builder.js"), exports);
__exportStar(require("./raw-builder/sql.js"), exports);
__exportStar(require("./query-executor/query-executor.js"), exports);
__exportStar(require("./query-executor/default-query-executor.js"), exports);
__exportStar(require("./query-executor/noop-query-executor.js"), exports);
__exportStar(require("./query-executor/query-executor-provider.js"), exports);
__exportStar(require("./query-compiler/default-query-compiler.js"), exports);
__exportStar(require("./query-compiler/compiled-query.js"), exports);
__exportStar(require("./schema/schema.js"), exports);
__exportStar(require("./schema/create-table-builder.js"), exports);
__exportStar(require("./schema/create-type-builder.js"), exports);
__exportStar(require("./schema/drop-table-builder.js"), exports);
__exportStar(require("./schema/drop-type-builder.js"), exports);
__exportStar(require("./schema/create-index-builder.js"), exports);
__exportStar(require("./schema/drop-index-builder.js"), exports);
__exportStar(require("./schema/create-schema-builder.js"), exports);
__exportStar(require("./schema/drop-schema-builder.js"), exports);
__exportStar(require("./schema/column-definition-builder.js"), exports);
__exportStar(require("./schema/foreign-key-constraint-builder.js"), exports);
__exportStar(require("./schema/alter-table-builder.js"), exports);
__exportStar(require("./schema/create-view-builder.js"), exports);
__exportStar(require("./schema/refresh-materialized-view-builder.js"), exports);
__exportStar(require("./schema/drop-view-builder.js"), exports);
__exportStar(require("./schema/alter-column-builder.js"), exports);
__exportStar(require("./dynamic/dynamic.js"), exports);
__exportStar(require("./dynamic/dynamic-reference-builder.js"), exports);
__exportStar(require("./dynamic/dynamic-table-builder.js"), exports);
__exportStar(require("./driver/driver.js"), exports);
__exportStar(require("./driver/database-connection.js"), exports);
__exportStar(require("./driver/connection-provider.js"), exports);
__exportStar(require("./driver/default-connection-provider.js"), exports);
__exportStar(require("./driver/single-connection-provider.js"), exports);
__exportStar(require("./driver/dummy-driver.js"), exports);
__exportStar(require("./dialect/dialect.js"), exports);
__exportStar(require("./dialect/dialect-adapter.js"), exports);
__exportStar(require("./dialect/dialect-adapter-base.js"), exports);
__exportStar(require("./dialect/database-introspector.js"), exports);
__exportStar(require("./dialect/sqlite/sqlite-dialect.js"), exports);
__exportStar(require("./dialect/sqlite/sqlite-dialect-config.js"), exports);
__exportStar(require("./dialect/sqlite/sqlite-driver.js"), exports);
__exportStar(require("./dialect/postgres/postgres-query-compiler.js"), exports);
__exportStar(require("./dialect/postgres/postgres-introspector.js"), exports);
__exportStar(require("./dialect/postgres/postgres-adapter.js"), exports);
__exportStar(require("./dialect/mysql/mysql-dialect.js"), exports);
__exportStar(require("./dialect/mysql/mysql-dialect-config.js"), exports);
__exportStar(require("./dialect/mysql/mysql-driver.js"), exports);
__exportStar(require("./dialect/mysql/mysql-query-compiler.js"), exports);
__exportStar(require("./dialect/mysql/mysql-introspector.js"), exports);
__exportStar(require("./dialect/mysql/mysql-adapter.js"), exports);
__exportStar(require("./dialect/postgres/postgres-driver.js"), exports);
__exportStar(require("./dialect/postgres/postgres-dialect-config.js"), exports);
__exportStar(require("./dialect/postgres/postgres-dialect.js"), exports);
__exportStar(require("./dialect/sqlite/sqlite-query-compiler.js"), exports);
__exportStar(require("./dialect/sqlite/sqlite-introspector.js"), exports);
__exportStar(require("./dialect/sqlite/sqlite-adapter.js"), exports);
__exportStar(require("./dialect/mssql/mssql-adapter.js"), exports);
__exportStar(require("./dialect/mssql/mssql-dialect-config.js"), exports);
__exportStar(require("./dialect/mssql/mssql-dialect.js"), exports);
__exportStar(require("./dialect/mssql/mssql-driver.js"), exports);
__exportStar(require("./dialect/mssql/mssql-introspector.js"), exports);
__exportStar(require("./dialect/mssql/mssql-query-compiler.js"), exports);
__exportStar(require("./query-compiler/query-compiler.js"), exports);
__exportStar(require("./migration/migrator.js"), exports);
__exportStar(require("./migration/file-migration-provider.js"), exports);
__exportStar(require("./plugin/kysely-plugin.js"), exports);
__exportStar(require("./plugin/camel-case/camel-case-plugin.js"), exports);
__exportStar(require("./plugin/deduplicate-joins/deduplicate-joins-plugin.js"), exports);
__exportStar(require("./plugin/with-schema/with-schema-plugin.js"), exports);
__exportStar(require("./plugin/parse-json-results/parse-json-results-plugin.js"), exports);
__exportStar(require("./plugin/handle-empty-in-lists/handle-empty-in-lists-plugin.js"), exports);
__exportStar(require("./plugin/handle-empty-in-lists/handle-empty-in-lists.js"), exports);
__exportStar(require("./operation-node/add-column-node.js"), exports);
__exportStar(require("./operation-node/add-constraint-node.js"), exports);
__exportStar(require("./operation-node/add-index-node.js"), exports);
__exportStar(require("./operation-node/aggregate-function-node.js"), exports);
__exportStar(require("./operation-node/alias-node.js"), exports);
__exportStar(require("./operation-node/alter-column-node.js"), exports);
__exportStar(require("./operation-node/alter-table-node.js"), exports);
__exportStar(require("./operation-node/and-node.js"), exports);
__exportStar(require("./operation-node/binary-operation-node.js"), exports);
__exportStar(require("./operation-node/case-node.js"), exports);
__exportStar(require("./operation-node/cast-node.js"), exports);
__exportStar(require("./operation-node/check-constraint-node.js"), exports);
__exportStar(require("./operation-node/collate-node.js"), exports);
__exportStar(require("./operation-node/column-definition-node.js"), exports);
__exportStar(require("./operation-node/column-node.js"), exports);
__exportStar(require("./operation-node/column-update-node.js"), exports);
__exportStar(require("./operation-node/common-table-expression-name-node.js"), exports);
__exportStar(require("./operation-node/common-table-expression-node.js"), exports);
__exportStar(require("./operation-node/constraint-node.js"), exports);
__exportStar(require("./operation-node/create-index-node.js"), exports);
__exportStar(require("./operation-node/create-schema-node.js"), exports);
__exportStar(require("./operation-node/create-table-node.js"), exports);
__exportStar(require("./operation-node/create-type-node.js"), exports);
__exportStar(require("./operation-node/create-view-node.js"), exports);
__exportStar(require("./operation-node/refresh-materialized-view-node.js"), exports);
__exportStar(require("./operation-node/data-type-node.js"), exports);
__exportStar(require("./operation-node/default-insert-value-node.js"), exports);
__exportStar(require("./operation-node/default-value-node.js"), exports);
__exportStar(require("./operation-node/delete-query-node.js"), exports);
__exportStar(require("./operation-node/drop-column-node.js"), exports);
__exportStar(require("./operation-node/drop-constraint-node.js"), exports);
__exportStar(require("./operation-node/drop-index-node.js"), exports);
__exportStar(require("./operation-node/drop-schema-node.js"), exports);
__exportStar(require("./operation-node/drop-table-node.js"), exports);
__exportStar(require("./operation-node/drop-type-node.js"), exports);
__exportStar(require("./operation-node/drop-view-node.js"), exports);
__exportStar(require("./operation-node/explain-node.js"), exports);
__exportStar(require("./operation-node/fetch-node.js"), exports);
__exportStar(require("./operation-node/foreign-key-constraint-node.js"), exports);
__exportStar(require("./operation-node/from-node.js"), exports);
__exportStar(require("./operation-node/function-node.js"), exports);
__exportStar(require("./operation-node/generated-node.js"), exports);
__exportStar(require("./operation-node/group-by-item-node.js"), exports);
__exportStar(require("./operation-node/group-by-node.js"), exports);
__exportStar(require("./operation-node/having-node.js"), exports);
__exportStar(require("./operation-node/identifier-node.js"), exports);
__exportStar(require("./operation-node/insert-query-node.js"), exports);
__exportStar(require("./operation-node/join-node.js"), exports);
__exportStar(require("./operation-node/json-operator-chain-node.js"), exports);
__exportStar(require("./operation-node/json-path-leg-node.js"), exports);
__exportStar(require("./operation-node/json-path-node.js"), exports);
__exportStar(require("./operation-node/json-reference-node.js"), exports);
__exportStar(require("./operation-node/limit-node.js"), exports);
__exportStar(require("./operation-node/list-node.js"), exports);
__exportStar(require("./operation-node/matched-node.js"), exports);
__exportStar(require("./operation-node/merge-query-node.js"), exports);
__exportStar(require("./operation-node/modify-column-node.js"), exports);
__exportStar(require("./operation-node/offset-node.js"), exports);
__exportStar(require("./operation-node/on-conflict-node.js"), exports);
__exportStar(require("./operation-node/on-duplicate-key-node.js"), exports);
__exportStar(require("./operation-node/on-node.js"), exports);
__exportStar(require("./operation-node/operation-node-source.js"), exports);
__exportStar(require("./operation-node/operation-node-transformer.js"), exports);
__exportStar(require("./operation-node/operation-node-visitor.js"), exports);
__exportStar(require("./operation-node/operation-node.js"), exports);
__exportStar(require("./operation-node/operator-node.js"), exports);
__exportStar(require("./operation-node/or-action-node.js"), exports);
__exportStar(require("./operation-node/or-node.js"), exports);
__exportStar(require("./operation-node/order-by-item-node.js"), exports);
__exportStar(require("./operation-node/order-by-node.js"), exports);
__exportStar(require("./operation-node/output-node.js"), exports);
__exportStar(require("./operation-node/over-node.js"), exports);
__exportStar(require("./operation-node/parens-node.js"), exports);
__exportStar(require("./operation-node/partition-by-item-node.js"), exports);
__exportStar(require("./operation-node/partition-by-node.js"), exports);
__exportStar(require("./operation-node/primary-key-constraint-node.js"), exports);
__exportStar(require("./operation-node/primitive-value-list-node.js"), exports);
__exportStar(require("./operation-node/query-node.js"), exports);
__exportStar(require("./operation-node/raw-node.js"), exports);
__exportStar(require("./operation-node/reference-node.js"), exports);
__exportStar(require("./operation-node/references-node.js"), exports);
__exportStar(require("./operation-node/rename-column-node.js"), exports);
__exportStar(require("./operation-node/rename-constraint-node.js"), exports);
__exportStar(require("./operation-node/returning-node.js"), exports);
__exportStar(require("./operation-node/schemable-identifier-node.js"), exports);
__exportStar(require("./operation-node/select-all-node.js"), exports);
__exportStar(require("./operation-node/select-modifier-node.js"), exports);
__exportStar(require("./operation-node/select-query-node.js"), exports);
__exportStar(require("./operation-node/selection-node.js"), exports);
__exportStar(require("./operation-node/set-operation-node.js"), exports);
__exportStar(require("./operation-node/simple-reference-expression-node.js"), exports);
__exportStar(require("./operation-node/table-node.js"), exports);
__exportStar(require("./operation-node/top-node.js"), exports);
__exportStar(require("./operation-node/tuple-node.js"), exports);
__exportStar(require("./operation-node/unary-operation-node.js"), exports);
__exportStar(require("./operation-node/unique-constraint-node.js"), exports);
__exportStar(require("./operation-node/update-query-node.js"), exports);
__exportStar(require("./operation-node/using-node.js"), exports);
__exportStar(require("./operation-node/value-list-node.js"), exports);
__exportStar(require("./operation-node/value-node.js"), exports);
__exportStar(require("./operation-node/values-node.js"), exports);
__exportStar(require("./operation-node/when-node.js"), exports);
__exportStar(require("./operation-node/where-node.js"), exports);
__exportStar(require("./operation-node/with-node.js"), exports);
__exportStar(require("./util/column-type.js"), exports);
__exportStar(require("./util/compilable.js"), exports);
__exportStar(require("./util/explainable.js"), exports);
__exportStar(require("./util/streamable.js"), exports);
__exportStar(require("./util/log.js"), exports);
__exportStar(require("./util/infer-result.js"), exports);
var log_once_js_1 = require("./util/log-once.js");
Object.defineProperty(exports, "logOnce", { enumerable: true, get: function () { return log_once_js_1.logOnce; } });
var query_id_js_1 = require("./util/query-id.js");
Object.defineProperty(exports, "createQueryId", { enumerable: true, get: function () { return query_id_js_1.createQueryId; } });
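The `__createBinding`/`__exportStar` helpers emitted at the top of this file are how TypeScript compiles `export * from '...'` to CommonJS: each named binding is copied onto the barrel's `exports` object as a live getter, while `default` and already-present names are skipped. A minimal self-contained sketch of that logic (the module contents here are hypothetical stand-ins, not kysely's real files):

```javascript
"use strict";

// A stand-in for one of the compiled source modules above.
const sourceModule = Object.defineProperty({}, "__esModule", { value: true });
Object.defineProperty(sourceModule, "logOnce", {
  enumerable: true,
  get: () => (msg) => console.warn(msg), // live binding via getter
});
sourceModule.default = "ignored"; // `default` is deliberately not re-exported

// A stand-in for the barrel's `exports` object.
const barrelExports = {};

// Same logic as the emitted helpers: copy each enumerable property
// (except `default` and names already on the target) as a live getter.
function exportStar(m, target) {
  for (const p in m) {
    if (p !== "default" && !Object.prototype.hasOwnProperty.call(target, p)) {
      let desc = Object.getOwnPropertyDescriptor(m, p);
      if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: () => m[p] };
      }
      Object.defineProperty(target, p, desc);
    }
  }
}

exportStar(sourceModule, barrelExports);
console.log(typeof barrelExports.logOnce); // "function"
console.log("default" in barrelExports);   // false
```

Because the copied descriptor is a getter, consumers of the barrel always see the source module's current binding rather than a snapshot taken at require time.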

848
node_modules/kysely/dist/cjs/kysely.d.ts generated vendored Normal file

@@ -0,0 +1,848 @@
import type { Dialect } from './dialect/dialect.js';
import { SchemaModule } from './schema/schema.js';
import { DynamicModule } from './dynamic/dynamic.js';
import type { QueryExecutor } from './query-executor/query-executor.js';
import { QueryCreator } from './query-creator.js';
import type { KyselyPlugin } from './plugin/kysely-plugin.js';
import type { DatabaseIntrospector } from './dialect/database-introspector.js';
import { type Driver, type IsolationLevel, type AccessMode } from './driver/driver.js';
import { type FunctionModule } from './query-builder/function-module.js';
import { type LogConfig } from './util/log.js';
import type { QueryExecutorProvider } from './query-executor/query-executor-provider.js';
import type { QueryResult } from './driver/database-connection.js';
import type { CompiledQuery } from './query-compiler/compiled-query.js';
import { type QueryId } from './util/query-id.js';
import { type Compilable } from './util/compilable.js';
import { CaseBuilder } from './query-builder/case-builder.js';
import type { Expression } from './expression/expression.js';
import type { DrainOuterGeneric } from './util/type-utils.js';
import type { ReleaseSavepoint, RollbackToSavepoint } from './parser/savepoint-parser.js';
import { type ControlledConnection } from './util/provide-controlled-connection.js';
declare global {
interface AsyncDisposable {
}
interface SymbolConstructor {
readonly asyncDispose: unique symbol;
}
}
/**
* The main Kysely class.
*
* You should create one instance of `Kysely` per database using the {@link Kysely}
* constructor. Each `Kysely` instance maintains its own connection pool.
*
* ### Examples
*
* This example assumes your database has a "person" table:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { type Generated, Kysely, SqliteDialect } from 'kysely'
*
* interface Database {
* person: {
* id: Generated<number>
* first_name: string
* last_name: string | null
* }
* }
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:'),
* })
* })
* ```
*
* @typeParam DB - The database interface type. Keys of this type must be table names
* in the database and values must be interfaces that describe the rows in those
* tables. See the examples above.
*/
export declare class Kysely<DB> extends QueryCreator<DB> implements QueryExecutorProvider, AsyncDisposable {
#private;
constructor(args: KyselyConfig);
constructor(args: KyselyProps);
/**
* Returns the {@link SchemaModule} module for building database schema.
*/
get schema(): SchemaModule;
/**
* Returns the {@link DynamicModule} module.
*
* The {@link DynamicModule} module can be used to bypass strict typing and
* pass dynamic values into queries.
*/
get dynamic(): DynamicModule<DB>;
/**
* Returns a {@link DatabaseIntrospector | database introspector}.
*/
get introspection(): DatabaseIntrospector;
/**
* Creates a `case` statement/operator.
*
* See {@link ExpressionBuilder.case} for more information.
*/
case(): CaseBuilder<DB, keyof DB>;
case<V>(value: Expression<V>): CaseBuilder<DB, keyof DB, V>;
/**
* Returns a {@link FunctionModule} that can be used to write somewhat type-safe function
* calls.
*
* ```ts
* const { count } = db.fn
*
* await db.selectFrom('person')
* .innerJoin('pet', 'pet.owner_id', 'person.id')
* .select([
* 'id',
* count('pet.id').as('person_count'),
* ])
* .groupBy('person.id')
* .having(count('pet.id'), '>', 10)
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person"."id", count("pet"."id") as "person_count"
* from "person"
* inner join "pet" on "pet"."owner_id" = "person"."id"
* group by "person"."id"
* having count("pet"."id") > $1
* ```
*
* Why "somewhat" type-safe? Because the function calls are not bound to the
* current query context. They allow you to reference columns and tables that
* are not in the current query. E.g. remove the `innerJoin` from the previous
* query and TypeScript won't even complain.
*
* If you want to make the function calls fully type-safe, you can use the
* {@link ExpressionBuilder.fn} getter for a query context-aware, stricter {@link FunctionModule}.
*
* ```ts
* await db.selectFrom('person')
* .innerJoin('pet', 'pet.owner_id', 'person.id')
* .select((eb) => [
* 'person.id',
* eb.fn.count('pet.id').as('pet_count')
* ])
* .groupBy('person.id')
* .having((eb) => eb.fn.count('pet.id'), '>', 10)
* .execute()
* ```
*/
get fn(): FunctionModule<DB, keyof DB>;
/**
* Creates a {@link TransactionBuilder} that can be used to run queries inside a transaction.
*
* The returned {@link TransactionBuilder} can be used to configure the transaction. The
* {@link TransactionBuilder.execute} method can then be called to run the transaction.
* {@link TransactionBuilder.execute} takes a function that is run inside the
* transaction. If the function throws an exception,
* 1. the exception is caught,
* 2. the transaction is rolled back, and
* 3. the exception is thrown again.
* Otherwise the transaction is committed.
*
* The callback function passed to the {@link TransactionBuilder.execute | execute}
* method gets the transaction object as its only argument. The transaction is
* of type {@link Transaction} which inherits {@link Kysely}. Any query
* started through the transaction object is executed inside the transaction.
*
* To run a controlled transaction, allowing you to commit and roll back manually,
* use {@link startTransaction} instead.
*
* ### Examples
*
* <!-- siteExample("transactions", "Simple transaction", 10) -->
*
* This example inserts two rows in a transaction. If an exception is thrown inside
* the callback passed to the `execute` method,
* 1. the exception is caught,
* 2. the transaction is rolled back, and
* 3. the exception is thrown again.
* Otherwise the transaction is committed.
*
* ```ts
* const catto = await db.transaction().execute(async (trx) => {
* const jennifer = await trx.insertInto('person')
* .values({
* first_name: 'Jennifer',
* last_name: 'Aniston',
* age: 40,
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* return await trx.insertInto('pet')
* .values({
* owner_id: jennifer.id,
* name: 'Catto',
* species: 'cat',
* is_favorite: false,
* })
* .returningAll()
* .executeTakeFirst()
* })
* ```
*
* Setting the isolation level:
*
* ```ts
* import type { Kysely } from 'kysely'
*
* await db
* .transaction()
* .setIsolationLevel('serializable')
* .execute(async (trx) => {
* await doStuff(trx)
* })
*
* async function doStuff(kysely: typeof db) {
* // ...
* }
* ```
*/
transaction(): TransactionBuilder<DB>;
/**
* Creates a {@link ControlledTransactionBuilder} that can be used to run queries inside a controlled transaction.
*
* The returned {@link ControlledTransactionBuilder} can be used to configure the transaction.
* The {@link ControlledTransactionBuilder.execute} method can then be called
* to start the transaction and return a {@link ControlledTransaction}.
*
 * A {@link ControlledTransaction} allows you to commit and roll back manually,
 * and execute savepoint commands. It extends {@link Transaction} which extends {@link Kysely},
* so you can run queries inside the transaction. Once the transaction is committed,
* or rolled back, it can't be used anymore - all queries will throw an error.
* This is to prevent accidentally running queries outside the transaction - where
* atomicity is not guaranteed anymore.
*
* ### Examples
*
* <!-- siteExample("transactions", "Controlled transaction", 11) -->
*
 * A controlled transaction allows you to commit and roll back manually, execute
 * savepoint commands, and run queries in general.
*
* In this example we start a transaction, use it to insert two rows and then commit
 * the transaction. If an error is thrown, we catch it and roll back the transaction.
*
* ```ts
* const trx = await db.startTransaction().execute()
*
* try {
* const jennifer = await trx.insertInto('person')
* .values({
* first_name: 'Jennifer',
* last_name: 'Aniston',
* age: 40,
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* const catto = await trx.insertInto('pet')
* .values({
* owner_id: jennifer.id,
* name: 'Catto',
* species: 'cat',
* is_favorite: false,
* })
* .returningAll()
* .executeTakeFirstOrThrow()
*
* await trx.commit().execute()
*
* // ...
* } catch (error) {
* await trx.rollback().execute()
* }
* ```
*
* <!-- siteExample("transactions", "Controlled transaction /w savepoints", 12) -->
*
 * A controlled transaction allows you to commit and roll back manually, execute
 * savepoint commands, and run queries in general.
*
* In this example we start a transaction, insert a person, create a savepoint,
 * try inserting a toy and a pet, and if an error is thrown, we roll back to the
 * savepoint. Eventually we release the savepoint, insert an audit record and
 * commit the transaction. If an error is thrown, we catch it and roll back the
 * transaction.
*
* ```ts
* const trx = await db.startTransaction().execute()
*
* try {
* const jennifer = await trx
* .insertInto('person')
* .values({
* first_name: 'Jennifer',
* last_name: 'Aniston',
* age: 40,
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* const trxAfterJennifer = await trx.savepoint('after_jennifer').execute()
*
* try {
* const catto = await trxAfterJennifer
* .insertInto('pet')
* .values({
* owner_id: jennifer.id,
* name: 'Catto',
* species: 'cat',
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* await trxAfterJennifer
* .insertInto('toy')
* .values({ name: 'Bone', price: 1.99, pet_id: catto.id })
* .execute()
* } catch (error) {
* await trxAfterJennifer.rollbackToSavepoint('after_jennifer').execute()
* }
*
* await trxAfterJennifer.releaseSavepoint('after_jennifer').execute()
*
* await trx.insertInto('audit').values({ action: 'added Jennifer' }).execute()
*
* await trx.commit().execute()
* } catch (error) {
* await trx.rollback().execute()
* }
* ```
*/
startTransaction(): ControlledTransactionBuilder<DB>;
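/*
The "can't be used anymore" rule documented for {@link ControlledTransaction}
can be sketched, independently of kysely, as a guard around every operation.
This is a hypothetical illustration, not kysely's implementation; all names
below are invented for the sketch.

```ts
// Hypothetical sketch: once `commit` or `rollback` has run, every further
// operation throws, mirroring the documented ControlledTransaction behavior.
function createControlledTrxSketch() {
  let state: 'open' | 'committed' | 'rolledBack' = 'open'

  // Every operation funnels through this guard.
  function assertOpen() {
    if (state !== 'open') {
      throw new Error(`transaction is already ${state}`)
    }
  }

  return {
    get isCommitted() {
      return state === 'committed'
    },
    get isRolledBack() {
      return state === 'rolledBack'
    },
    // Stand-in for running a query through the transaction.
    executeQuery(sql: string): string {
      assertOpen()
      return `executed: ${sql}`
    },
    commit() {
      assertOpen()
      state = 'committed'
    },
    rollback() {
      assertOpen()
      state = 'rolledBack'
    },
  }
}
```

The guard is what makes accidental use after commit or rollback fail loudly
instead of silently running outside the transaction.
*/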
/**
* Provides a kysely instance bound to a single database connection.
*
* ### Examples
*
* ```ts
* await db
* .connection()
* .execute(async (db) => {
* // `db` is an instance of `Kysely` that's bound to a single
* // database connection. All queries executed through `db` use
* // the same connection.
* await doStuff(db)
* })
*
* async function doStuff(kysely: typeof db) {
* // ...
* }
* ```
*/
connection(): ConnectionBuilder<DB>;
/**
* Returns a copy of this Kysely instance with the given plugin installed.
*/
withPlugin(plugin: KyselyPlugin): Kysely<DB>;
/**
* Returns a copy of this Kysely instance without any plugins.
*/
withoutPlugins(): Kysely<DB>;
/**
* @override
*/
withSchema(schema: string): Kysely<DB>;
/**
* Returns a copy of this Kysely instance with tables added to its
* database type.
*
* This method only modifies the types and doesn't affect any of the
* executed queries in any way.
*
* ### Examples
*
* The following example adds and uses a temporary table:
*
* ```ts
* await db.schema
* .createTable('temp_table')
* .temporary()
* .addColumn('some_column', 'integer')
* .execute()
*
* const tempDb = db.withTables<{
* temp_table: {
* some_column: number
* }
* }>()
*
* await tempDb
* .insertInto('temp_table')
* .values({ some_column: 100 })
* .execute()
* ```
*/
withTables<T extends Record<string, Record<string, any>>>(): Kysely<DrainOuterGeneric<DB & T>>;
/**
* Releases all resources and disconnects from the database.
*
* You need to call this when you are done using the `Kysely` instance.
*/
destroy(): Promise<void>;
/**
* Returns true if this `Kysely` instance is a transaction.
*
* You can also use `db instanceof Transaction`.
*/
get isTransaction(): boolean;
/**
* @internal
* @private
*/
getExecutor(): QueryExecutor;
/**
* Executes a given compiled query or query builder.
*
* See {@link https://github.com/kysely-org/kysely/blob/master/site/docs/recipes/0004-splitting-query-building-and-execution.md#execute-compiled-queries splitting build, compile and execute code recipe} for more information.
*/
executeQuery<R>(query: CompiledQuery<R> | Compilable<R>, queryId?: QueryId): Promise<QueryResult<R>>;
[Symbol.asyncDispose](): Promise<void>;
}
export declare class Transaction<DB> extends Kysely<DB> {
#private;
constructor(props: KyselyProps);
/**
* Returns true if this `Kysely` instance is a transaction.
*
* You can also use `db instanceof Transaction`.
*/
get isTransaction(): true;
/**
* Creates a {@link TransactionBuilder} that can be used to run queries inside a transaction.
*
* The returned {@link TransactionBuilder} can be used to configure the transaction. The
* {@link TransactionBuilder.execute} method can then be called to run the transaction.
* {@link TransactionBuilder.execute} takes a function that is run inside the
* transaction. If the function throws an exception,
* 1. the exception is caught,
* 2. the transaction is rolled back, and
* 3. the exception is thrown again.
* Otherwise the transaction is committed.
*
* The callback function passed to the {@link TransactionBuilder.execute | execute}
* method gets the transaction object as its only argument. The transaction is
* of type {@link Transaction} which inherits {@link Kysely}. Any query
* started through the transaction object is executed inside the transaction.
*
 * To run a controlled transaction, allowing you to commit and roll back manually,
* use {@link startTransaction} instead.
*
* ### Examples
*
* <!-- siteExample("transactions", "Simple transaction", 10) -->
*
* This example inserts two rows in a transaction. If an exception is thrown inside
* the callback passed to the `execute` method,
* 1. the exception is caught,
* 2. the transaction is rolled back, and
* 3. the exception is thrown again.
* Otherwise the transaction is committed.
*
* ```ts
* const catto = await db.transaction().execute(async (trx) => {
* const jennifer = await trx.insertInto('person')
* .values({
* first_name: 'Jennifer',
* last_name: 'Aniston',
* age: 40,
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* return await trx.insertInto('pet')
* .values({
* owner_id: jennifer.id,
* name: 'Catto',
* species: 'cat',
* is_favorite: false,
* })
* .returningAll()
* .executeTakeFirst()
* })
* ```
*
* Setting the isolation level:
*
* ```ts
* import type { Kysely } from 'kysely'
*
* await db
* .transaction()
* .setIsolationLevel('serializable')
* .execute(async (trx) => {
* await doStuff(trx)
* })
*
* async function doStuff(kysely: typeof db) {
* // ...
* }
* ```
*/
transaction(): TransactionBuilder<DB>;
/**
* Provides a kysely instance bound to a single database connection.
*
* ### Examples
*
* ```ts
* await db
* .connection()
* .execute(async (db) => {
* // `db` is an instance of `Kysely` that's bound to a single
* // database connection. All queries executed through `db` use
* // the same connection.
* await doStuff(db)
* })
*
* async function doStuff(kysely: typeof db) {
* // ...
* }
* ```
*/
connection(): ConnectionBuilder<DB>;
/**
* Releases all resources and disconnects from the database.
*
* You need to call this when you are done using the `Kysely` instance.
*/
destroy(): Promise<void>;
/**
* Returns a copy of this Kysely instance with the given plugin installed.
*/
withPlugin(plugin: KyselyPlugin): Transaction<DB>;
/**
* Returns a copy of this Kysely instance without any plugins.
*/
withoutPlugins(): Transaction<DB>;
/**
* @override
*/
withSchema(schema: string): Transaction<DB>;
/**
* Returns a copy of this Kysely instance with tables added to its
* database type.
*
* This method only modifies the types and doesn't affect any of the
* executed queries in any way.
*
* ### Examples
*
* The following example adds and uses a temporary table:
*
* ```ts
* await db.schema
* .createTable('temp_table')
* .temporary()
* .addColumn('some_column', 'integer')
* .execute()
*
* const tempDb = db.withTables<{
* temp_table: {
* some_column: number
* }
* }>()
*
* await tempDb
* .insertInto('temp_table')
* .values({ some_column: 100 })
* .execute()
* ```
*/
withTables<T extends Record<string, Record<string, any>>>(): Transaction<DrainOuterGeneric<DB & T>>;
}
export interface KyselyProps {
readonly config: KyselyConfig;
readonly driver: Driver;
readonly executor: QueryExecutor;
readonly dialect: Dialect;
}
export declare function isKyselyProps(obj: unknown): obj is KyselyProps;
export interface KyselyConfig {
readonly dialect: Dialect;
readonly plugins?: KyselyPlugin[];
/**
* A list of log levels to log or a custom logger function.
*
 * Currently there are only two levels: `query` and `error`.
* This will be expanded based on user feedback later.
*
* ### Examples
*
* Setting up built-in logging for preferred log levels:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { Kysely, SqliteDialect } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:'),
* }),
* log: ['query', 'error']
* })
* ```
*
* Setting up custom logging:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { Kysely, SqliteDialect } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:'),
* }),
* log(event): void {
* if (event.level === 'query') {
* console.log(event.query.sql)
* console.log(event.query.parameters)
* }
* }
* })
* ```
*/
readonly log?: LogConfig;
}
export declare class ConnectionBuilder<DB> {
#private;
constructor(props: ConnectionBuilderProps);
execute<T>(callback: (db: Kysely<DB>) => Promise<T>): Promise<T>;
}
interface ConnectionBuilderProps extends KyselyProps {
}
export declare class TransactionBuilder<DB> {
#private;
constructor(props: TransactionBuilderProps);
setAccessMode(accessMode: AccessMode): TransactionBuilder<DB>;
setIsolationLevel(isolationLevel: IsolationLevel): TransactionBuilder<DB>;
execute<T>(callback: (trx: Transaction<DB>) => Promise<T>): Promise<T>;
}
interface TransactionBuilderProps extends KyselyProps {
readonly accessMode?: AccessMode;
readonly isolationLevel?: IsolationLevel;
}
export declare class ControlledTransactionBuilder<DB> {
#private;
constructor(props: TransactionBuilderProps);
setAccessMode(accessMode: AccessMode): ControlledTransactionBuilder<DB>;
setIsolationLevel(isolationLevel: IsolationLevel): ControlledTransactionBuilder<DB>;
execute(): Promise<ControlledTransaction<DB>>;
}
export declare class ControlledTransaction<DB, S extends string[] = []> extends Transaction<DB> {
#private;
constructor(props: ControlledTransactionProps);
get isCommitted(): boolean;
get isRolledBack(): boolean;
/**
* Commits the transaction.
*
* See {@link rollback}.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* try {
* await doSomething(trx)
*
* await trx.commit().execute()
* } catch (error) {
* await trx.rollback().execute()
* }
*
* async function doSomething(kysely: Kysely<Database>) {}
* ```
*/
commit(): Command<void>;
/**
* Rolls back the transaction.
*
* See {@link commit} and {@link rollbackToSavepoint}.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* try {
* await doSomething(trx)
*
* await trx.commit().execute()
* } catch (error) {
* await trx.rollback().execute()
* }
*
* async function doSomething(kysely: Kysely<Database>) {}
* ```
*/
rollback(): Command<void>;
/**
* Creates a savepoint with a given name.
*
* See {@link rollbackToSavepoint} and {@link releaseSavepoint}.
*
* For a type-safe experience, you should use the returned instance from now on.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* await insertJennifer(trx)
*
* const trxAfterJennifer = await trx.savepoint('after_jennifer').execute()
*
* try {
* await doSomething(trxAfterJennifer)
* } catch (error) {
* await trxAfterJennifer.rollbackToSavepoint('after_jennifer').execute()
* }
*
* async function insertJennifer(kysely: Kysely<Database>) {}
* async function doSomething(kysely: Kysely<Database>) {}
* ```
*/
savepoint<SN extends string>(savepointName: SN extends S ? never : SN): Command<ControlledTransaction<DB, [...S, SN]>>;
/**
* Rolls back to a savepoint with a given name.
*
* See {@link savepoint} and {@link releaseSavepoint}.
*
* You must use the same instance returned by {@link savepoint}, or
* escape the type-check by using `as any`.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* await insertJennifer(trx)
*
* const trxAfterJennifer = await trx.savepoint('after_jennifer').execute()
*
* try {
* await doSomething(trxAfterJennifer)
* } catch (error) {
* await trxAfterJennifer.rollbackToSavepoint('after_jennifer').execute()
* }
*
* async function insertJennifer(kysely: Kysely<Database>) {}
* async function doSomething(kysely: Kysely<Database>) {}
* ```
*/
rollbackToSavepoint<SN extends S[number]>(savepointName: SN): RollbackToSavepoint<S, SN> extends string[] ? Command<ControlledTransaction<DB, RollbackToSavepoint<S, SN>>> : never;
/**
* Releases a savepoint with a given name.
*
* See {@link savepoint} and {@link rollbackToSavepoint}.
*
* You must use the same instance returned by {@link savepoint}, or
* escape the type-check by using `as any`.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* await insertJennifer(trx)
*
* const trxAfterJennifer = await trx.savepoint('after_jennifer').execute()
*
* try {
* await doSomething(trxAfterJennifer)
* } catch (error) {
* await trxAfterJennifer.rollbackToSavepoint('after_jennifer').execute()
* }
*
* await trxAfterJennifer.releaseSavepoint('after_jennifer').execute()
*
* await doSomethingElse(trx)
*
* async function insertJennifer(kysely: Kysely<Database>) {}
* async function doSomething(kysely: Kysely<Database>) {}
* async function doSomethingElse(kysely: Kysely<Database>) {}
* ```
*/
releaseSavepoint<SN extends S[number]>(savepointName: SN): ReleaseSavepoint<S, SN> extends string[] ? Command<ControlledTransaction<DB, ReleaseSavepoint<S, SN>>> : never;
/**
* Returns a copy of this Kysely instance with the given plugin installed.
*/
withPlugin(plugin: KyselyPlugin): ControlledTransaction<DB, S>;
/**
* Returns a copy of this Kysely instance without any plugins.
*/
withoutPlugins(): ControlledTransaction<DB, S>;
/**
* @override
*/
withSchema(schema: string): ControlledTransaction<DB, S>;
/**
* Returns a copy of this Kysely instance with tables added to its
* database type.
*
* This method only modifies the types and doesn't affect any of the
* executed queries in any way.
*
* ### Examples
*
* The following example adds and uses a temporary table:
*
* ```ts
* await db.schema
* .createTable('temp_table')
* .temporary()
* .addColumn('some_column', 'integer')
* .execute()
*
* const tempDb = db.withTables<{
* temp_table: {
* some_column: number
* }
* }>()
*
* await tempDb
* .insertInto('temp_table')
* .values({ some_column: 100 })
* .execute()
* ```
*/
withTables<T extends Record<string, Record<string, any>>>(): ControlledTransaction<DrainOuterGeneric<DB & T>, S>;
}
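/*
The `S extends string[]` parameter above tracks savepoint names at the type
level: `savepoint` appends a name, `rollbackToSavepoint` keeps the target
savepoint but discards later ones, and `releaseSavepoint` discards the target
and everything after it. A hypothetical runtime analogue of that bookkeeping
(not kysely's implementation; the function names are invented) could look like:

```ts
// Push a new savepoint name onto the stack; duplicates are rejected,
// mirroring how the type-level `SN extends S ? never : SN` check works.
function savepoint(stack: string[], name: string): string[] {
  if (stack.includes(name)) throw new Error(`duplicate savepoint: ${name}`)
  return [...stack, name]
}

// Rolling back keeps the savepoint itself, so it can be rolled back to
// again or released later; only savepoints created after it are dropped.
function rollbackToSavepoint(stack: string[], name: string): string[] {
  const i = stack.indexOf(name)
  if (i < 0) throw new Error(`unknown savepoint: ${name}`)
  return stack.slice(0, i + 1)
}

// Releasing removes the savepoint and any savepoints created after it.
function releaseSavepoint(stack: string[], name: string): string[] {
  const i = stack.indexOf(name)
  if (i < 0) throw new Error(`unknown savepoint: ${name}`)
  return stack.slice(0, i)
}
```

This matches standard SQL savepoint semantics: `ROLLBACK TO SAVEPOINT`
preserves the savepoint, while `RELEASE SAVEPOINT` destroys it.
*/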
interface ControlledTransactionProps extends KyselyProps {
readonly connection: ControlledConnection;
}
export declare class Command<T> {
#private;
constructor(cb: () => Promise<T>);
/**
* Executes the command.
*/
execute(): Promise<T>;
}
export {};

927
node_modules/kysely/dist/cjs/kysely.js generated vendored Normal file
View File

@@ -0,0 +1,927 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Command = exports.ControlledTransaction = exports.ControlledTransactionBuilder = exports.TransactionBuilder = exports.ConnectionBuilder = exports.Transaction = exports.Kysely = void 0;
exports.isKyselyProps = isKyselyProps;
const schema_js_1 = require("./schema/schema.js");
const dynamic_js_1 = require("./dynamic/dynamic.js");
const default_connection_provider_js_1 = require("./driver/default-connection-provider.js");
const query_creator_js_1 = require("./query-creator.js");
const default_query_executor_js_1 = require("./query-executor/default-query-executor.js");
const object_utils_js_1 = require("./util/object-utils.js");
const runtime_driver_js_1 = require("./driver/runtime-driver.js");
const single_connection_provider_js_1 = require("./driver/single-connection-provider.js");
const driver_js_1 = require("./driver/driver.js");
const function_module_js_1 = require("./query-builder/function-module.js");
const log_js_1 = require("./util/log.js");
const query_id_js_1 = require("./util/query-id.js");
const compilable_js_1 = require("./util/compilable.js");
const case_builder_js_1 = require("./query-builder/case-builder.js");
const case_node_js_1 = require("./operation-node/case-node.js");
const expression_parser_js_1 = require("./parser/expression-parser.js");
const with_schema_plugin_js_1 = require("./plugin/with-schema/with-schema-plugin.js");
const provide_controlled_connection_js_1 = require("./util/provide-controlled-connection.js");
const log_once_js_1 = require("./util/log-once.js");
// @ts-ignore
Symbol.asyncDispose ??= Symbol('Symbol.asyncDispose');
/**
* The main Kysely class.
*
* You should create one instance of `Kysely` per database using the {@link Kysely}
* constructor. Each `Kysely` instance maintains its own connection pool.
*
* ### Examples
*
* This example assumes your database has a "person" table:
*
* ```ts
* import * as Sqlite from 'better-sqlite3'
* import { type Generated, Kysely, SqliteDialect } from 'kysely'
*
* interface Database {
* person: {
* id: Generated<number>
* first_name: string
* last_name: string | null
* }
* }
*
* const db = new Kysely<Database>({
* dialect: new SqliteDialect({
* database: new Sqlite(':memory:'),
* })
* })
* ```
*
* @typeParam DB - The database interface type. Keys of this type must be table names
* in the database and values must be interfaces that describe the rows in those
* tables. See the examples above.
*/
class Kysely extends query_creator_js_1.QueryCreator {
#props;
constructor(args) {
let superProps;
let props;
if (isKyselyProps(args)) {
superProps = { executor: args.executor };
props = { ...args };
}
else {
const dialect = args.dialect;
const driver = dialect.createDriver();
const compiler = dialect.createQueryCompiler();
const adapter = dialect.createAdapter();
const log = new log_js_1.Log(args.log ?? []);
const runtimeDriver = new runtime_driver_js_1.RuntimeDriver(driver, log);
const connectionProvider = new default_connection_provider_js_1.DefaultConnectionProvider(runtimeDriver);
const executor = new default_query_executor_js_1.DefaultQueryExecutor(compiler, adapter, connectionProvider, args.plugins ?? []);
superProps = { executor };
props = {
config: args,
executor,
dialect,
driver: runtimeDriver,
};
}
super(superProps);
this.#props = (0, object_utils_js_1.freeze)(props);
}
/**
* Returns the {@link SchemaModule} module for building database schema.
*/
get schema() {
return new schema_js_1.SchemaModule(this.#props.executor);
}
/**
     * Returns the {@link DynamicModule} module.
     *
     * The {@link DynamicModule} module can be used to bypass strict typing and
     * pass in dynamic values for the queries.
*/
get dynamic() {
return new dynamic_js_1.DynamicModule();
}
/**
* Returns a {@link DatabaseIntrospector | database introspector}.
*/
get introspection() {
return this.#props.dialect.createIntrospector(this.withoutPlugins());
}
case(value) {
return new case_builder_js_1.CaseBuilder({
node: case_node_js_1.CaseNode.create((0, object_utils_js_1.isUndefined)(value) ? undefined : (0, expression_parser_js_1.parseExpression)(value)),
});
}
/**
* Returns a {@link FunctionModule} that can be used to write somewhat type-safe function
* calls.
*
* ```ts
* const { count } = db.fn
*
* await db.selectFrom('person')
* .innerJoin('pet', 'pet.owner_id', 'person.id')
* .select([
* 'id',
* count('pet.id').as('person_count'),
* ])
* .groupBy('person.id')
* .having(count('pet.id'), '>', 10)
* .execute()
* ```
*
* The generated SQL (PostgreSQL):
*
* ```sql
* select "person"."id", count("pet"."id") as "person_count"
* from "person"
* inner join "pet" on "pet"."owner_id" = "person"."id"
* group by "person"."id"
* having count("pet"."id") > $1
* ```
*
* Why "somewhat" type-safe? Because the function calls are not bound to the
* current query context. They allow you to reference columns and tables that
* are not in the current query. E.g. remove the `innerJoin` from the previous
* query and TypeScript won't even complain.
*
* If you want to make the function calls fully type-safe, you can use the
* {@link ExpressionBuilder.fn} getter for a query context-aware, stricter {@link FunctionModule}.
*
* ```ts
* await db.selectFrom('person')
* .innerJoin('pet', 'pet.owner_id', 'person.id')
* .select((eb) => [
* 'person.id',
* eb.fn.count('pet.id').as('pet_count')
* ])
* .groupBy('person.id')
* .having((eb) => eb.fn.count('pet.id'), '>', 10)
* .execute()
* ```
*/
get fn() {
return (0, function_module_js_1.createFunctionModule)();
}
/**
* Creates a {@link TransactionBuilder} that can be used to run queries inside a transaction.
*
* The returned {@link TransactionBuilder} can be used to configure the transaction. The
* {@link TransactionBuilder.execute} method can then be called to run the transaction.
* {@link TransactionBuilder.execute} takes a function that is run inside the
* transaction. If the function throws an exception,
* 1. the exception is caught,
* 2. the transaction is rolled back, and
* 3. the exception is thrown again.
* Otherwise the transaction is committed.
*
* The callback function passed to the {@link TransactionBuilder.execute | execute}
* method gets the transaction object as its only argument. The transaction is
* of type {@link Transaction} which inherits {@link Kysely}. Any query
* started through the transaction object is executed inside the transaction.
*
     * To run a controlled transaction, allowing you to commit and roll back manually,
* use {@link startTransaction} instead.
*
* ### Examples
*
* <!-- siteExample("transactions", "Simple transaction", 10) -->
*
* This example inserts two rows in a transaction. If an exception is thrown inside
* the callback passed to the `execute` method,
* 1. the exception is caught,
* 2. the transaction is rolled back, and
* 3. the exception is thrown again.
* Otherwise the transaction is committed.
*
* ```ts
* const catto = await db.transaction().execute(async (trx) => {
* const jennifer = await trx.insertInto('person')
* .values({
* first_name: 'Jennifer',
* last_name: 'Aniston',
* age: 40,
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* return await trx.insertInto('pet')
* .values({
* owner_id: jennifer.id,
* name: 'Catto',
* species: 'cat',
* is_favorite: false,
* })
* .returningAll()
* .executeTakeFirst()
* })
* ```
*
* Setting the isolation level:
*
* ```ts
* import type { Kysely } from 'kysely'
*
* await db
* .transaction()
* .setIsolationLevel('serializable')
* .execute(async (trx) => {
* await doStuff(trx)
* })
*
* async function doStuff(kysely: typeof db) {
* // ...
* }
* ```
*/
transaction() {
return new TransactionBuilder({ ...this.#props });
}
/**
* Creates a {@link ControlledTransactionBuilder} that can be used to run queries inside a controlled transaction.
*
* The returned {@link ControlledTransactionBuilder} can be used to configure the transaction.
* The {@link ControlledTransactionBuilder.execute} method can then be called
* to start the transaction and return a {@link ControlledTransaction}.
*
     * A {@link ControlledTransaction} allows you to commit and roll back manually,
     * and execute savepoint commands. It extends {@link Transaction} which extends {@link Kysely},
* so you can run queries inside the transaction. Once the transaction is committed,
* or rolled back, it can't be used anymore - all queries will throw an error.
* This is to prevent accidentally running queries outside the transaction - where
* atomicity is not guaranteed anymore.
*
* ### Examples
*
* <!-- siteExample("transactions", "Controlled transaction", 11) -->
*
     * A controlled transaction allows you to commit and roll back manually, execute
     * savepoint commands, and run queries in general.
*
* In this example we start a transaction, use it to insert two rows and then commit
     * the transaction. If an error is thrown, we catch it and roll back the transaction.
*
* ```ts
* const trx = await db.startTransaction().execute()
*
* try {
* const jennifer = await trx.insertInto('person')
* .values({
* first_name: 'Jennifer',
* last_name: 'Aniston',
* age: 40,
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* const catto = await trx.insertInto('pet')
* .values({
* owner_id: jennifer.id,
* name: 'Catto',
* species: 'cat',
* is_favorite: false,
* })
* .returningAll()
* .executeTakeFirstOrThrow()
*
* await trx.commit().execute()
*
* // ...
* } catch (error) {
* await trx.rollback().execute()
* }
* ```
*
* <!-- siteExample("transactions", "Controlled transaction /w savepoints", 12) -->
*
     * A controlled transaction allows you to commit and roll back manually, execute
     * savepoint commands, and run queries in general.
*
* In this example we start a transaction, insert a person, create a savepoint,
     * try inserting a toy and a pet, and if an error is thrown, we roll back to the
     * savepoint. Eventually we release the savepoint, insert an audit record and
     * commit the transaction. If an error is thrown, we catch it and roll back the
     * transaction.
*
* ```ts
* const trx = await db.startTransaction().execute()
*
* try {
* const jennifer = await trx
* .insertInto('person')
* .values({
* first_name: 'Jennifer',
* last_name: 'Aniston',
* age: 40,
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* const trxAfterJennifer = await trx.savepoint('after_jennifer').execute()
*
* try {
* const catto = await trxAfterJennifer
* .insertInto('pet')
* .values({
* owner_id: jennifer.id,
* name: 'Catto',
* species: 'cat',
* })
* .returning('id')
* .executeTakeFirstOrThrow()
*
* await trxAfterJennifer
* .insertInto('toy')
* .values({ name: 'Bone', price: 1.99, pet_id: catto.id })
* .execute()
* } catch (error) {
* await trxAfterJennifer.rollbackToSavepoint('after_jennifer').execute()
* }
*
* await trxAfterJennifer.releaseSavepoint('after_jennifer').execute()
*
* await trx.insertInto('audit').values({ action: 'added Jennifer' }).execute()
*
* await trx.commit().execute()
* } catch (error) {
* await trx.rollback().execute()
* }
* ```
*/
startTransaction() {
return new ControlledTransactionBuilder({ ...this.#props });
}
/**
* Provides a kysely instance bound to a single database connection.
*
* ### Examples
*
* ```ts
* await db
* .connection()
* .execute(async (db) => {
* // `db` is an instance of `Kysely` that's bound to a single
* // database connection. All queries executed through `db` use
* // the same connection.
* await doStuff(db)
* })
*
* async function doStuff(kysely: typeof db) {
* // ...
* }
* ```
*/
connection() {
return new ConnectionBuilder({ ...this.#props });
}
/**
* Returns a copy of this Kysely instance with the given plugin installed.
*/
withPlugin(plugin) {
return new Kysely({
...this.#props,
executor: this.#props.executor.withPlugin(plugin),
});
}
/**
* Returns a copy of this Kysely instance without any plugins.
*/
withoutPlugins() {
return new Kysely({
...this.#props,
executor: this.#props.executor.withoutPlugins(),
});
}
/**
* @override
*/
withSchema(schema) {
return new Kysely({
...this.#props,
executor: this.#props.executor.withPluginAtFront(new with_schema_plugin_js_1.WithSchemaPlugin(schema)),
});
}
/**
* Returns a copy of this Kysely instance with tables added to its
* database type.
*
* This method only modifies the types and doesn't affect any of the
* executed queries in any way.
*
* ### Examples
*
* The following example adds and uses a temporary table:
*
* ```ts
* await db.schema
* .createTable('temp_table')
* .temporary()
* .addColumn('some_column', 'integer')
* .execute()
*
* const tempDb = db.withTables<{
* temp_table: {
* some_column: number
* }
* }>()
*
* await tempDb
* .insertInto('temp_table')
* .values({ some_column: 100 })
* .execute()
* ```
*/
withTables() {
return new Kysely({ ...this.#props });
}
/**
* Releases all resources and disconnects from the database.
*
* You need to call this when you are done using the `Kysely` instance.
*/
async destroy() {
await this.#props.driver.destroy();
}
/**
* Returns true if this `Kysely` instance is a transaction.
*
* You can also use `db instanceof Transaction`.
*/
get isTransaction() {
return false;
}
/**
* @internal
* @private
*/
getExecutor() {
return this.#props.executor;
}
/**
* Executes a given compiled query or query builder.
*
* See {@link https://github.com/kysely-org/kysely/blob/master/site/docs/recipes/0004-splitting-query-building-and-execution.md#execute-compiled-queries splitting build, compile and execute code recipe} for more information.
*/
executeQuery(query,
// TODO: remove this in the future. deprecated in 0.28.x
queryId) {
if (queryId !== undefined) {
(0, log_once_js_1.logOnce)('Passing `queryId` in `db.executeQuery` is deprecated and will result in a compile-time error in the future.');
}
const compiledQuery = (0, compilable_js_1.isCompilable)(query) ? query.compile() : query;
return this.getExecutor().executeQuery(compiledQuery);
}
async [Symbol.asyncDispose]() {
await this.destroy();
}
}
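/*
`executeQuery` above routes its deprecation warning through `logOnce` so the
message is printed at most once per process. A minimal sketch of such a
helper (the real ./util/log-once.js implementation may differ) could be:

```js
// Hypothetical "log once" helper: remembers every message it has already
// emitted and silently ignores repeats.
const seenMessages = new Set();

function logOnce(message) {
  if (seenMessages.has(message)) {
    return;
  }
  seenMessages.add(message);
  console.warn(message);
}
```

Deduplicating on the message string keeps repeated deprecated calls from
flooding the log while still warning once for each distinct message.
*/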
exports.Kysely = Kysely;
class Transaction extends Kysely {
#props;
constructor(props) {
super(props);
this.#props = props;
}
// The return type is `true` instead of `boolean` to make Kysely<DB>
// unassignable to Transaction<DB> while allowing assignment the
// other way around.
get isTransaction() {
return true;
}
transaction() {
throw new Error('calling the transaction method for a Transaction is not supported');
}
connection() {
throw new Error('calling the connection method for a Transaction is not supported');
}
async destroy() {
throw new Error('calling the destroy method for a Transaction is not supported');
}
withPlugin(plugin) {
return new Transaction({
...this.#props,
executor: this.#props.executor.withPlugin(plugin),
});
}
withoutPlugins() {
return new Transaction({
...this.#props,
executor: this.#props.executor.withoutPlugins(),
});
}
withSchema(schema) {
return new Transaction({
...this.#props,
executor: this.#props.executor.withPluginAtFront(new with_schema_plugin_js_1.WithSchemaPlugin(schema)),
});
}
withTables() {
return new Transaction({ ...this.#props });
}
}
exports.Transaction = Transaction;
function isKyselyProps(obj) {
return ((0, object_utils_js_1.isObject)(obj) &&
(0, object_utils_js_1.isObject)(obj.config) &&
(0, object_utils_js_1.isObject)(obj.driver) &&
(0, object_utils_js_1.isObject)(obj.executor) &&
(0, object_utils_js_1.isObject)(obj.dialect));
}
class ConnectionBuilder {
#props;
constructor(props) {
this.#props = (0, object_utils_js_1.freeze)(props);
}
async execute(callback) {
return this.#props.executor.provideConnection(async (connection) => {
const executor = this.#props.executor.withConnectionProvider(new single_connection_provider_js_1.SingleConnectionProvider(connection));
const db = new Kysely({
...this.#props,
executor,
});
return await callback(db);
});
}
}
exports.ConnectionBuilder = ConnectionBuilder;
class TransactionBuilder {
#props;
constructor(props) {
this.#props = (0, object_utils_js_1.freeze)(props);
}
setAccessMode(accessMode) {
return new TransactionBuilder({
...this.#props,
accessMode,
});
}
setIsolationLevel(isolationLevel) {
return new TransactionBuilder({
...this.#props,
isolationLevel,
});
}
async execute(callback) {
const { isolationLevel, accessMode, ...kyselyProps } = this.#props;
const settings = { isolationLevel, accessMode };
(0, driver_js_1.validateTransactionSettings)(settings);
return this.#props.executor.provideConnection(async (connection) => {
const state = { isCommitted: false, isRolledBack: false };
const executor = new NotCommittedOrRolledBackAssertingExecutor(this.#props.executor.withConnectionProvider(new single_connection_provider_js_1.SingleConnectionProvider(connection)), state);
const transaction = new Transaction({
...kyselyProps,
executor,
});
let transactionBegun = false;
try {
await this.#props.driver.beginTransaction(connection, settings);
transactionBegun = true;
const result = await callback(transaction);
await this.#props.driver.commitTransaction(connection);
state.isCommitted = true;
return result;
}
catch (error) {
if (transactionBegun) {
await this.#props.driver.rollbackTransaction(connection);
state.isRolledBack = true;
}
throw error;
}
});
}
}
exports.TransactionBuilder = TransactionBuilder;
class ControlledTransactionBuilder {
#props;
constructor(props) {
this.#props = (0, object_utils_js_1.freeze)(props);
}
setAccessMode(accessMode) {
return new ControlledTransactionBuilder({
...this.#props,
accessMode,
});
}
setIsolationLevel(isolationLevel) {
return new ControlledTransactionBuilder({
...this.#props,
isolationLevel,
});
}
async execute() {
const { isolationLevel, accessMode, ...props } = this.#props;
const settings = { isolationLevel, accessMode };
(0, driver_js_1.validateTransactionSettings)(settings);
const connection = await (0, provide_controlled_connection_js_1.provideControlledConnection)(this.#props.executor);
await this.#props.driver.beginTransaction(connection.connection, settings);
return new ControlledTransaction({
...props,
connection,
executor: this.#props.executor.withConnectionProvider(new single_connection_provider_js_1.SingleConnectionProvider(connection.connection)),
});
}
}
exports.ControlledTransactionBuilder = ControlledTransactionBuilder;
class ControlledTransaction extends Transaction {
#props;
#compileQuery;
#state;
constructor(props) {
const state = { isCommitted: false, isRolledBack: false };
props = {
...props,
executor: new NotCommittedOrRolledBackAssertingExecutor(props.executor, state),
};
const { connection, ...transactionProps } = props;
super(transactionProps);
this.#props = (0, object_utils_js_1.freeze)(props);
this.#state = state;
const queryId = (0, query_id_js_1.createQueryId)();
this.#compileQuery = (node) => props.executor.compileQuery(node, queryId);
}
get isCommitted() {
return this.#state.isCommitted;
}
get isRolledBack() {
return this.#state.isRolledBack;
}
/**
* Commits the transaction.
*
* See {@link rollback}.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* try {
* await doSomething(trx)
*
* await trx.commit().execute()
* } catch (error) {
* await trx.rollback().execute()
* }
*
* async function doSomething(kysely: Kysely<Database>) {}
* ```
*/
commit() {
assertNotCommittedOrRolledBack(this.#state);
return new Command(async () => {
await this.#props.driver.commitTransaction(this.#props.connection.connection);
this.#state.isCommitted = true;
this.#props.connection.release();
});
}
/**
* Rolls back the transaction.
*
* See {@link commit} and {@link rollbackToSavepoint}.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* try {
* await doSomething(trx)
*
* await trx.commit().execute()
* } catch (error) {
* await trx.rollback().execute()
* }
*
* async function doSomething(kysely: Kysely<Database>) {}
* ```
*/
rollback() {
assertNotCommittedOrRolledBack(this.#state);
return new Command(async () => {
await this.#props.driver.rollbackTransaction(this.#props.connection.connection);
this.#state.isRolledBack = true;
this.#props.connection.release();
});
}
/**
* Creates a savepoint with a given name.
*
* See {@link rollbackToSavepoint} and {@link releaseSavepoint}.
*
* For a type-safe experience, you should use the returned instance from now on.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* await insertJennifer(trx)
*
* const trxAfterJennifer = await trx.savepoint('after_jennifer').execute()
*
* try {
* await doSomething(trxAfterJennifer)
* } catch (error) {
* await trxAfterJennifer.rollbackToSavepoint('after_jennifer').execute()
* }
*
* async function insertJennifer(kysely: Kysely<Database>) {}
* async function doSomething(kysely: Kysely<Database>) {}
* ```
*/
savepoint(savepointName) {
assertNotCommittedOrRolledBack(this.#state);
return new Command(async () => {
await this.#props.driver.savepoint?.(this.#props.connection.connection, savepointName, this.#compileQuery);
return new ControlledTransaction({ ...this.#props });
});
}
/**
* Rolls back to a savepoint with a given name.
*
* See {@link savepoint} and {@link releaseSavepoint}.
*
* You must use the same instance returned by {@link savepoint}, or
* escape the type-check by using `as any`.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* await insertJennifer(trx)
*
* const trxAfterJennifer = await trx.savepoint('after_jennifer').execute()
*
* try {
* await doSomething(trxAfterJennifer)
* } catch (error) {
* await trxAfterJennifer.rollbackToSavepoint('after_jennifer').execute()
* }
*
* async function insertJennifer(kysely: Kysely<Database>) {}
* async function doSomething(kysely: Kysely<Database>) {}
* ```
*/
rollbackToSavepoint(savepointName) {
assertNotCommittedOrRolledBack(this.#state);
return new Command(async () => {
await this.#props.driver.rollbackToSavepoint?.(this.#props.connection.connection, savepointName, this.#compileQuery);
return new ControlledTransaction({ ...this.#props });
});
}
/**
* Releases a savepoint with a given name.
*
* See {@link savepoint} and {@link rollbackToSavepoint}.
*
* You must use the same instance returned by {@link savepoint}, or
* escape the type-check by using `as any`.
*
* ### Examples
*
* ```ts
* import type { Kysely } from 'kysely'
* import type { Database } from 'type-editor' // imaginary module
*
* const trx = await db.startTransaction().execute()
*
* await insertJennifer(trx)
*
* const trxAfterJennifer = await trx.savepoint('after_jennifer').execute()
*
* try {
* await doSomething(trxAfterJennifer)
* } catch (error) {
* await trxAfterJennifer.rollbackToSavepoint('after_jennifer').execute()
* }
*
* await trxAfterJennifer.releaseSavepoint('after_jennifer').execute()
*
* await doSomethingElse(trx)
*
* async function insertJennifer(kysely: Kysely<Database>) {}
* async function doSomething(kysely: Kysely<Database>) {}
* async function doSomethingElse(kysely: Kysely<Database>) {}
* ```
*/
releaseSavepoint(savepointName) {
assertNotCommittedOrRolledBack(this.#state);
return new Command(async () => {
await this.#props.driver.releaseSavepoint?.(this.#props.connection.connection, savepointName, this.#compileQuery);
return new ControlledTransaction({ ...this.#props });
});
}
withPlugin(plugin) {
return new ControlledTransaction({
...this.#props,
executor: this.#props.executor.withPlugin(plugin),
});
}
withoutPlugins() {
return new ControlledTransaction({
...this.#props,
executor: this.#props.executor.withoutPlugins(),
});
}
withSchema(schema) {
return new ControlledTransaction({
...this.#props,
executor: this.#props.executor.withPluginAtFront(new with_schema_plugin_js_1.WithSchemaPlugin(schema)),
});
}
withTables() {
return new ControlledTransaction({ ...this.#props });
}
}
exports.ControlledTransaction = ControlledTransaction;
class Command {
#cb;
constructor(cb) {
this.#cb = cb;
}
/**
* Executes the command.
*/
async execute() {
return await this.#cb();
}
}
exports.Command = Command;
function assertNotCommittedOrRolledBack(state) {
if (state.isCommitted) {
throw new Error('Transaction is already committed');
}
if (state.isRolledBack) {
throw new Error('Transaction is already rolled back');
}
}
/**
* An executor wrapper that asserts that the transaction state is not committed
* or rolled back when a query is executed.
*
* @internal
*/
class NotCommittedOrRolledBackAssertingExecutor {
#executor;
#state;
constructor(executor, state) {
if (executor instanceof NotCommittedOrRolledBackAssertingExecutor) {
this.#executor = executor.#executor;
}
else {
this.#executor = executor;
}
this.#state = state;
}
get adapter() {
return this.#executor.adapter;
}
get plugins() {
return this.#executor.plugins;
}
transformQuery(node, queryId) {
return this.#executor.transformQuery(node, queryId);
}
compileQuery(node, queryId) {
return this.#executor.compileQuery(node, queryId);
}
provideConnection(consumer) {
return this.#executor.provideConnection(consumer);
}
executeQuery(compiledQuery) {
assertNotCommittedOrRolledBack(this.#state);
return this.#executor.executeQuery(compiledQuery);
}
stream(compiledQuery, chunkSize) {
assertNotCommittedOrRolledBack(this.#state);
return this.#executor.stream(compiledQuery, chunkSize);
}
withConnectionProvider(connectionProvider) {
return new NotCommittedOrRolledBackAssertingExecutor(this.#executor.withConnectionProvider(connectionProvider), this.#state);
}
withPlugin(plugin) {
return new NotCommittedOrRolledBackAssertingExecutor(this.#executor.withPlugin(plugin), this.#state);
}
withPlugins(plugins) {
return new NotCommittedOrRolledBackAssertingExecutor(this.#executor.withPlugins(plugins), this.#state);
}
withPluginAtFront(plugin) {
return new NotCommittedOrRolledBackAssertingExecutor(this.#executor.withPluginAtFront(plugin), this.#state);
}
withoutPlugins() {
return new NotCommittedOrRolledBackAssertingExecutor(this.#executor.withoutPlugins(), this.#state);
}
}
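The `NotCommittedOrRolledBackAssertingExecutor` above implements a simple shared-state guard: the controlled transaction and its executor close over one mutable state object, and every query checks the `isCommitted`/`isRolledBack` flags before running. The following is a minimal, self-contained sketch of that pattern in plain TypeScript — the names (`makeControlledTx`, `executeQuery`) are illustrative, not Kysely API:

```typescript
// Sketch of the shared-state guard: commands and queries close over one
// mutable state object and refuse to run once the transaction is finalized.
type TxState = { isCommitted: boolean; isRolledBack: boolean }

function assertActive(state: TxState): void {
  if (state.isCommitted) throw new Error('Transaction is already committed')
  if (state.isRolledBack) throw new Error('Transaction is already rolled back')
}

function makeControlledTx() {
  const state: TxState = { isCommitted: false, isRolledBack: false }
  return {
    executeQuery(): string {
      assertActive(state) // every query checks the shared flags first
      return 'row'
    },
    commit(): void {
      assertActive(state) // double-commit is rejected too
      state.isCommitted = true
    },
  }
}

const tx = makeControlledTx()
console.log(tx.executeQuery()) // allowed while the transaction is active
tx.commit()
let rejected = false
try {
  tx.executeQuery() // throws: the guard sees isCommitted === true
} catch {
  rejected = true
}
console.log(rejected) // true
```

Because the state object is shared by reference, marking the transaction committed in one place immediately invalidates every executor wrapping that same state.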


@@ -0,0 +1,48 @@
import type { Migration, MigrationProvider } from './migrator.js';
/**
* Reads all migrations from a folder in node.js.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
*
* new FileMigrationProvider({
* fs,
* path,
* migrationFolder: 'path/to/migrations/folder'
* })
* ```
*/
export declare class FileMigrationProvider implements MigrationProvider {
#private;
constructor(props: FileMigrationProviderProps);
/**
* Returns all migrations, old and new.
*
* For example if you have your migrations in a folder as separate files,
 * you can implement this method to return all migrations in that folder
* as {@link Migration} objects.
*
* Kysely already has a built-in {@link FileMigrationProvider} for node.js
* that does exactly that.
*
 * The keys of the returned object are migration names and the values are the
 * migrations. The order of the migrations is determined by the alphabetical
 * order of the migration names. The items in the object don't need to be
 * sorted; Kysely sorts them.
*/
getMigrations(): Promise<Record<string, Migration>>;
}
export interface FileMigrationProviderFS {
readdir(path: string): Promise<string[]>;
}
export interface FileMigrationProviderPath {
join(...path: string[]): string;
}
export interface FileMigrationProviderProps {
fs: FileMigrationProviderFS;
path: FileMigrationProviderPath;
migrationFolder: string;
}
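Since `MigrationProvider` is just an interface with a single `getMigrations` method, migrations don't have to come from files at all. Below is a hedged sketch of an in-memory provider; the `Migration`/`MigrationProvider` shapes are inlined as structural stand-ins so the snippet is self-contained, whereas a real project would import them from `'kysely'`:

```typescript
// Structural stand-ins for Kysely's interfaces, inlined for self-containment.
interface Migration {
  up(db: unknown): Promise<void>
  down?(db: unknown): Promise<void>
}
interface MigrationProvider {
  getMigrations(): Promise<Record<string, Migration>>
}

// Migrations defined inline in code instead of as files on disk. The keys
// are the migration names; Kysely sorts them, so insertion order is
// irrelevant.
class InMemoryMigrationProvider implements MigrationProvider {
  constructor(private readonly migrations: Record<string, Migration>) {}

  async getMigrations(): Promise<Record<string, Migration>> {
    return this.migrations
  }
}

const provider = new InMemoryMigrationProvider({
  '2024_01_01_create_person': {
    async up() {}, // would create the table here
    async down() {}, // would drop the table here
  },
})

provider.getMigrations().then((migrations) => {
  console.log(Object.keys(migrations)) // ['2024_01_01_create_person']
})
```

An instance like this can be passed as the `provider` in `MigratorProps` anywhere a `FileMigrationProvider` would be used.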


@@ -0,0 +1,52 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.FileMigrationProvider = void 0;
const object_utils_js_1 = require("../util/object-utils.js");
/**
* Reads all migrations from a folder in node.js.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
*
* new FileMigrationProvider({
* fs,
* path,
* migrationFolder: 'path/to/migrations/folder'
* })
* ```
*/
class FileMigrationProvider {
#props;
constructor(props) {
this.#props = props;
}
async getMigrations() {
const migrations = {};
const files = await this.#props.fs.readdir(this.#props.migrationFolder);
for (const fileName of files) {
if (fileName.endsWith('.js') ||
(fileName.endsWith('.ts') && !fileName.endsWith('.d.ts')) ||
fileName.endsWith('.mjs') ||
(fileName.endsWith('.mts') && !fileName.endsWith('.d.mts'))) {
const migration = await Promise.resolve(`${
/* webpackIgnore: true */ this.#props.path.join(this.#props.migrationFolder, fileName)}`).then(s => require(s));
const migrationKey = fileName.substring(0, fileName.lastIndexOf('.'));
// Handle esModuleInterop export's `default` prop...
if (isMigration(migration?.default)) {
migrations[migrationKey] = migration.default;
}
else if (isMigration(migration)) {
migrations[migrationKey] = migration;
}
}
}
return migrations;
}
}
exports.FileMigrationProvider = FileMigrationProvider;
function isMigration(obj) {
return (0, object_utils_js_1.isObject)(obj) && (0, object_utils_js_1.isFunction)(obj.up);
}
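The loop in `getMigrations` above only loads runnable module files and derives each migration's name from its file name. This self-contained sketch restates that filtering and key derivation (helper names are illustrative, not exported by Kysely):

```typescript
// Only .js, .mjs, .ts and .mts files are loaded; TypeScript declaration
// files (.d.ts, .d.mts) are skipped. The migration key is the file name
// without its final extension.
function isMigrationFile(fileName: string): boolean {
  return (
    fileName.endsWith('.js') ||
    (fileName.endsWith('.ts') && !fileName.endsWith('.d.ts')) ||
    fileName.endsWith('.mjs') ||
    (fileName.endsWith('.mts') && !fileName.endsWith('.d.mts'))
  )
}

function migrationKey(fileName: string): string {
  return fileName.substring(0, fileName.lastIndexOf('.'))
}

console.log(isMigrationFile('001_create_person.ts')) // true
console.log(isMigrationFile('001_create_person.d.ts')) // false
console.log(migrationKey('001_create_person.mjs')) // '001_create_person'
```

Note that because alphabetical order of the keys determines execution order, a fixed-width naming scheme (timestamps or zero-padded sequence numbers) keeps migrations sorted as intended.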

395
node_modules/kysely/dist/cjs/migration/migrator.d.ts generated vendored Normal file

@@ -0,0 +1,395 @@
import type { Kysely } from '../kysely.js';
export declare const DEFAULT_MIGRATION_TABLE = "kysely_migration";
export declare const DEFAULT_MIGRATION_LOCK_TABLE = "kysely_migration_lock";
export declare const DEFAULT_ALLOW_UNORDERED_MIGRATIONS = false;
export declare const MIGRATION_LOCK_ID = "migration_lock";
export declare const NO_MIGRATIONS: NoMigrations;
export interface Migration {
up(db: Kysely<any>): Promise<void>;
/**
* An optional down method.
*
* If you don't provide a down method, the migration is skipped when
* migrating down.
*/
down?(db: Kysely<any>): Promise<void>;
}
/**
* A class for running migrations.
*
* ### Example
*
 * This example uses the {@link FileMigrationProvider} that reads migration
 * files from a single folder. You can easily implement your own
 * {@link MigrationProvider} if you want to provide migrations in some
 * other way.
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import * as Sqlite from 'better-sqlite3'
* import {
* FileMigrationProvider,
* Kysely,
* Migrator,
* SqliteDialect
* } from 'kysely'
*
* const db = new Kysely<any>({
* dialect: new SqliteDialect({
* database: Sqlite(':memory:')
* })
* })
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
* ```
*/
export declare class Migrator {
#private;
constructor(props: MigratorProps);
/**
* Returns a {@link MigrationInfo} object for each migration.
*
* The returned array is sorted by migration name.
*/
getMigrations(): Promise<ReadonlyArray<MigrationInfo>>;
/**
* Runs all migrations that have not yet been run.
*
* This method returns a {@link MigrationResultSet} instance and _never_ throws.
* {@link MigrationResultSet.error} holds the error if something went wrong.
* {@link MigrationResultSet.results} contains information about which migrations
* were executed and which failed. See the examples below.
*
* This method goes through all possible migrations provided by the provider and runs the
* ones whose names come alphabetically after the last migration that has been run. If the
* list of executed migrations doesn't match the beginning of the list of possible migrations
* an error is returned.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import * as Sqlite from 'better-sqlite3'
* import { FileMigrationProvider, Migrator } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* const { error, results } = await migrator.migrateToLatest()
*
* results?.forEach((it) => {
* if (it.status === 'Success') {
* console.log(`migration "${it.migrationName}" was executed successfully`)
* } else if (it.status === 'Error') {
* console.error(`failed to execute migration "${it.migrationName}"`)
* }
* })
*
* if (error) {
* console.error('failed to run `migrateToLatest`')
* console.error(error)
* }
* ```
*/
migrateToLatest(): Promise<MigrationResultSet>;
/**
* Migrate up/down to a specific migration.
*
* This method returns a {@link MigrationResultSet} instance and _never_ throws.
* {@link MigrationResultSet.error} holds the error if something went wrong.
* {@link MigrationResultSet.results} contains information about which migrations
* were executed and which failed.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import { FileMigrationProvider, Migrator } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* await migrator.migrateTo('some_migration')
* ```
*
* If you specify the name of the first migration, this method migrates
* down to the first migration, but doesn't run the `down` method of
* the first migration. In case you want to migrate all the way down,
* you can use a special constant `NO_MIGRATIONS`:
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import { FileMigrationProvider, Migrator, NO_MIGRATIONS } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* await migrator.migrateTo(NO_MIGRATIONS)
* ```
*/
migrateTo(targetMigrationName: string | NoMigrations): Promise<MigrationResultSet>;
/**
* Migrate one step up.
*
* This method returns a {@link MigrationResultSet} instance and _never_ throws.
* {@link MigrationResultSet.error} holds the error if something went wrong.
* {@link MigrationResultSet.results} contains information about which migrations
* were executed and which failed.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import { FileMigrationProvider, Migrator } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* await migrator.migrateUp()
* ```
*/
migrateUp(): Promise<MigrationResultSet>;
/**
* Migrate one step down.
*
* This method returns a {@link MigrationResultSet} instance and _never_ throws.
* {@link MigrationResultSet.error} holds the error if something went wrong.
* {@link MigrationResultSet.results} contains information about which migrations
* were executed and which failed.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import { FileMigrationProvider, Migrator } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* await migrator.migrateDown()
* ```
*/
migrateDown(): Promise<MigrationResultSet>;
}
export interface MigratorProps {
readonly db: Kysely<any>;
readonly provider: MigrationProvider;
/**
* The name of the internal migration table. Defaults to `kysely_migration`.
*
* If you do specify this, you need to ALWAYS use the same value. Kysely doesn't
* support changing the table on the fly. If you run the migrator even once with a
 * table name X and then change the table name to Y, Kysely will create a new empty
* migration table and attempt to run the migrations again, which will obviously
* fail.
*
* If you do specify this, ALWAYS ALWAYS use the same value from the beginning of
* the project, to the end of time or prepare to manually migrate the migration
* tables.
*/
readonly migrationTableName?: string;
/**
* The name of the internal migration lock table. Defaults to `kysely_migration_lock`.
*
* If you do specify this, you need to ALWAYS use the same value. Kysely doesn't
* support changing the table on the fly. If you run the migrator even once with a
 * table name X and then change the table name to Y, Kysely will create a new empty
* lock table.
*
* If you do specify this, ALWAYS ALWAYS use the same value from the beginning of
* the project, to the end of time or prepare to manually migrate the migration
* tables.
*/
readonly migrationLockTableName?: string;
/**
* The schema of the internal migration tables. Defaults to the default schema
* on dialects that support schemas.
*
 * If you do specify this, you need to ALWAYS use the same value. Kysely doesn't
 * support changing the schema on the fly. If you run the migrator even once with a
 * schema name X and then change the schema name to Y, Kysely will create new empty
 * migration tables in the new schema and attempt to run the migrations again, which
 * will obviously fail.
*
* If you do specify this, ALWAYS ALWAYS use the same value from the beginning of
* the project, to the end of time or prepare to manually migrate the migration
* tables.
*
* This only works on postgres and mssql.
*/
readonly migrationTableSchema?: string;
/**
 * Controls whether or not migrations must be run in alpha-numeric order.
*
* When false, migrations must be run in their exact alpha-numeric order.
* This is checked against the migrations already run in the database
* (`migrationTableName`). This ensures your migrations are always run in
* the same order and is the safest option.
*
* When true, migrations are still run in alpha-numeric order, but
* the order is not checked against already-run migrations in the database.
* Kysely will simply run all migrations that haven't run yet, in alpha-numeric
* order.
*/
readonly allowUnorderedMigrations?: boolean;
/**
* A function that compares migration names, used when sorting migrations in
* ascending order.
*
* Default is `name0.localeCompare(name1)`.
*/
readonly nameComparator?: (name0: string, name1: string) => number;
/**
* When `true`, don't run migrations in transactions even if the dialect supports transactional DDL.
*
* Default is `false`.
*
* This is useful when some migrations include queries that would fail otherwise.
*/
readonly disableTransactions?: boolean;
}
/**
* All migration methods ({@link Migrator.migrateTo | migrateTo},
* {@link Migrator.migrateToLatest | migrateToLatest} etc.) never
* throw but return this object instead.
*/
export interface MigrationResultSet {
/**
* This is defined if something went wrong.
*
* An error may have occurred in one of the migrations in which case the
* {@link results} list contains an item with `status === 'Error'` to
* indicate which migration failed.
*
* An error may also have occurred before Kysely was able to figure out
* which migrations should be executed, in which case the {@link results}
* list is undefined.
*/
readonly error?: unknown;
/**
* {@link MigrationResult} for each individual migration that was supposed
* to be executed by the operation.
*
* If all went well, each result's `status` is `Success`. If some migration
* failed, the failed migration's result's `status` is `Error` and all
 * results after that one have `status` `NotExecuted`.
*
* This property can be undefined if an error occurred before Kysely was
* able to figure out which migrations should be executed.
*
* If this list is empty, there were no migrations to execute.
*/
readonly results?: MigrationResult[];
}
type MigrationDirection = 'Up' | 'Down';
export interface MigrationResult {
readonly migrationName: string;
/**
* The direction in which this migration was executed.
*/
readonly direction: MigrationDirection;
/**
* The execution status.
*
 * - `Success` means the migration was successfully executed. Note that
 *   if any of the later migrations in the {@link MigrationResultSet.results}
 *   list failed (have status `Error`) AND the dialect supports transactional
 *   DDL, even the successful migrations were rolled back.
*
* - `Error` means the migration failed. In this case the
* {@link MigrationResultSet.error} contains the error.
*
* - `NotExecuted` means that the migration was supposed to be executed
* but wasn't because an earlier migration failed.
*/
readonly status: 'Success' | 'Error' | 'NotExecuted';
}
export interface MigrationProvider {
/**
* Returns all migrations, old and new.
*
* For example if you have your migrations in a folder as separate files,
 * you can implement this method to return all migrations in that folder
* as {@link Migration} objects.
*
* Kysely already has a built-in {@link FileMigrationProvider} for node.js
* that does exactly that.
*
 * The keys of the returned object are migration names and the values are the
 * migrations. The order of the migrations is determined by the alphabetical
 * order of the migration names. The items in the object don't need to be
 * sorted; Kysely sorts them.
*/
getMigrations(): Promise<Record<string, Migration>>;
}
/**
* Type for the {@link NO_MIGRATIONS} constant. Never create one of these.
*/
export interface NoMigrations {
readonly __noMigrations__: true;
}
export interface MigrationInfo {
/**
* Name of the migration.
*/
name: string;
/**
* The actual migration.
*/
migration: Migration;
/**
* When was the migration executed.
*
* If this is undefined, the migration hasn't been executed yet.
*/
executedAt?: Date;
}
export {};
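The `migrateTo` behavior documented above — rolling back past executed migrations, running forward through pending ones, and `NO_MIGRATIONS` meaning "all the way down" — reduces to a direction-and-step computation over the executed and pending name lists. A self-contained sketch under illustrative names (this is not Kysely's internal code verbatim):

```typescript
// Resolve a migrateTo target into a direction and step count, given the
// names of already-executed migrations and the still-pending migrations.
type Plan = { direction: 'Up' | 'Down'; step: number }

function planMigrateTo(
  target: string | { __noMigrations__: true },
  executed: string[],
  pending: string[],
): Plan {
  if (typeof target === 'object') {
    // NO_MIGRATIONS: migrate all the way down.
    return { direction: 'Down', step: Infinity }
  }
  const executedIndex = executed.indexOf(target)
  if (executedIndex !== -1) {
    // Roll back everything executed after the target; the target's own
    // `down` method is not run.
    return { direction: 'Down', step: executed.length - executedIndex - 1 }
  }
  const pendingIndex = pending.indexOf(target)
  if (pendingIndex !== -1) {
    // Run pending migrations up to and including the target.
    return { direction: 'Up', step: pendingIndex + 1 }
  }
  throw new Error(`migration "${target}" isn't executed or pending`)
}

console.log(planMigrateTo('b', ['a', 'b', 'c'], [])) // { direction: 'Down', step: 1 }
console.log(planMigrateTo('e', ['a'], ['d', 'e'])) // { direction: 'Up', step: 2 }
```

This also makes the documented edge case concrete: targeting the first executed migration yields `step: executed.length - 1` down-steps, leaving that first migration applied.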

611
node_modules/kysely/dist/cjs/migration/migrator.js generated vendored Normal file

@@ -0,0 +1,611 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Migrator = exports.NO_MIGRATIONS = exports.MIGRATION_LOCK_ID = exports.DEFAULT_ALLOW_UNORDERED_MIGRATIONS = exports.DEFAULT_MIGRATION_LOCK_TABLE = exports.DEFAULT_MIGRATION_TABLE = void 0;
const noop_plugin_js_1 = require("../plugin/noop-plugin.js");
const with_schema_plugin_js_1 = require("../plugin/with-schema/with-schema-plugin.js");
const object_utils_js_1 = require("../util/object-utils.js");
exports.DEFAULT_MIGRATION_TABLE = 'kysely_migration';
exports.DEFAULT_MIGRATION_LOCK_TABLE = 'kysely_migration_lock';
exports.DEFAULT_ALLOW_UNORDERED_MIGRATIONS = false;
exports.MIGRATION_LOCK_ID = 'migration_lock';
exports.NO_MIGRATIONS = (0, object_utils_js_1.freeze)({ __noMigrations__: true });
/**
* A class for running migrations.
*
* ### Example
*
 * This example uses the {@link FileMigrationProvider} that reads migration
 * files from a single folder. You can easily implement your own
 * {@link MigrationProvider} if you want to provide migrations in some
 * other way.
 
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import * as Sqlite from 'better-sqlite3'
* import {
* FileMigrationProvider,
* Kysely,
* Migrator,
* SqliteDialect
* } from 'kysely'
*
* const db = new Kysely<any>({
* dialect: new SqliteDialect({
* database: Sqlite(':memory:')
* })
* })
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
* ```
*/
class Migrator {
#props;
constructor(props) {
this.#props = (0, object_utils_js_1.freeze)(props);
}
/**
* Returns a {@link MigrationInfo} object for each migration.
*
* The returned array is sorted by migration name.
*/
async getMigrations() {
const tableExists = await this.#doesTableExist(this.#migrationTable);
const executedMigrations = tableExists
? await this.#props.db
.withPlugin(this.#schemaPlugin)
.selectFrom(this.#migrationTable)
.select(['name', 'timestamp'])
.$narrowType()
.execute()
: [];
const migrations = await this.#resolveMigrations();
return migrations.map(({ name, ...migration }) => {
const executed = executedMigrations.find((it) => it.name === name);
return {
name,
migration,
executedAt: executed ? new Date(executed.timestamp) : undefined,
};
});
}
/**
* Runs all migrations that have not yet been run.
*
* This method returns a {@link MigrationResultSet} instance and _never_ throws.
* {@link MigrationResultSet.error} holds the error if something went wrong.
* {@link MigrationResultSet.results} contains information about which migrations
* were executed and which failed. See the examples below.
*
* This method goes through all possible migrations provided by the provider and runs the
* ones whose names come alphabetically after the last migration that has been run. If the
* list of executed migrations doesn't match the beginning of the list of possible migrations
* an error is returned.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import * as Sqlite from 'better-sqlite3'
* import { FileMigrationProvider, Migrator } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* const { error, results } = await migrator.migrateToLatest()
*
* results?.forEach((it) => {
* if (it.status === 'Success') {
* console.log(`migration "${it.migrationName}" was executed successfully`)
* } else if (it.status === 'Error') {
* console.error(`failed to execute migration "${it.migrationName}"`)
* }
* })
*
* if (error) {
* console.error('failed to run `migrateToLatest`')
* console.error(error)
* }
* ```
*/
async migrateToLatest() {
return this.#migrate(() => ({ direction: 'Up', step: Infinity }));
}
/**
* Migrate up/down to a specific migration.
*
* This method returns a {@link MigrationResultSet} instance and _never_ throws.
* {@link MigrationResultSet.error} holds the error if something went wrong.
* {@link MigrationResultSet.results} contains information about which migrations
* were executed and which failed.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import { FileMigrationProvider, Migrator } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* await migrator.migrateTo('some_migration')
* ```
*
 * If you specify the name of the first migration, this method migrates
 * down to the first migration but doesn't run its `down` method. If you
 * want to migrate all the way down, use the special constant `NO_MIGRATIONS`:
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import { FileMigrationProvider, Migrator, NO_MIGRATIONS } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* await migrator.migrateTo(NO_MIGRATIONS)
* ```
*/
async migrateTo(targetMigrationName) {
return this.#migrate(({ migrations, executedMigrations, pendingMigrations, }) => {
if ((0, object_utils_js_1.isObject)(targetMigrationName) &&
targetMigrationName.__noMigrations__ === true) {
return { direction: 'Down', step: Infinity };
}
if (!migrations.find((m) => m.name === targetMigrationName)) {
throw new Error(`migration "${targetMigrationName}" doesn't exist`);
}
const executedIndex = executedMigrations.indexOf(targetMigrationName);
const pendingIndex = pendingMigrations.findIndex((m) => m.name === targetMigrationName);
if (executedIndex !== -1) {
return {
direction: 'Down',
step: executedMigrations.length - executedIndex - 1,
};
}
else if (pendingIndex !== -1) {
return { direction: 'Up', step: pendingIndex + 1 };
}
else {
throw new Error(`migration "${targetMigrationName}" isn't executed or pending`);
}
});
}
/**
* Migrate one step up.
*
* This method returns a {@link MigrationResultSet} instance and _never_ throws.
* {@link MigrationResultSet.error} holds the error if something went wrong.
* {@link MigrationResultSet.results} contains information about which migrations
* were executed and which failed.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import { FileMigrationProvider, Migrator } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* await migrator.migrateUp()
* ```
*/
async migrateUp() {
return this.#migrate(() => ({ direction: 'Up', step: 1 }));
}
/**
* Migrate one step down.
*
* This method returns a {@link MigrationResultSet} instance and _never_ throws.
* {@link MigrationResultSet.error} holds the error if something went wrong.
* {@link MigrationResultSet.results} contains information about which migrations
* were executed and which failed.
*
* ### Examples
*
* ```ts
* import { promises as fs } from 'node:fs'
* import path from 'node:path'
* import { FileMigrationProvider, Migrator } from 'kysely'
*
* const migrator = new Migrator({
* db,
* provider: new FileMigrationProvider({
* fs,
* // Path to the folder that contains all your migrations.
* migrationFolder: 'some/path/to/migrations',
* path,
* })
* })
*
* await migrator.migrateDown()
* ```
*/
async migrateDown() {
return this.#migrate(() => ({ direction: 'Down', step: 1 }));
}
async #migrate(getMigrationDirectionAndStep) {
try {
await this.#ensureMigrationTableSchemaExists();
await this.#ensureMigrationTableExists();
await this.#ensureMigrationLockTableExists();
await this.#ensureLockRowExists();
return await this.#runMigrations(getMigrationDirectionAndStep);
}
catch (error) {
if (error instanceof MigrationResultSetError) {
return error.resultSet;
}
return { error };
}
}
get #migrationTableSchema() {
return this.#props.migrationTableSchema;
}
get #migrationTable() {
return this.#props.migrationTableName ?? exports.DEFAULT_MIGRATION_TABLE;
}
get #migrationLockTable() {
return this.#props.migrationLockTableName ?? exports.DEFAULT_MIGRATION_LOCK_TABLE;
}
get #allowUnorderedMigrations() {
return (this.#props.allowUnorderedMigrations ?? exports.DEFAULT_ALLOW_UNORDERED_MIGRATIONS);
}
get #schemaPlugin() {
if (this.#migrationTableSchema) {
return new with_schema_plugin_js_1.WithSchemaPlugin(this.#migrationTableSchema);
}
return new noop_plugin_js_1.NoopPlugin();
}
async #ensureMigrationTableSchemaExists() {
if (!this.#migrationTableSchema) {
// Use default schema. Nothing to do.
return;
}
const schemaExists = await this.#doesSchemaExist();
if (schemaExists) {
return;
}
try {
await this.#createIfNotExists(this.#props.db.schema.createSchema(this.#migrationTableSchema));
}
catch (error) {
const schemaExists = await this.#doesSchemaExist();
            // At least on PostgreSQL, `if not exists` doesn't guarantee the `create schema`
            // query doesn't throw if the schema already exists. That's why we check whether
            // the schema exists here and ignore the error if it does.
if (!schemaExists) {
throw error;
}
}
}
async #ensureMigrationTableExists() {
const tableExists = await this.#doesTableExist(this.#migrationTable);
if (tableExists) {
return;
}
try {
await this.#createIfNotExists(this.#props.db.schema
.withPlugin(this.#schemaPlugin)
.createTable(this.#migrationTable)
.addColumn('name', 'varchar(255)', (col) => col.notNull().primaryKey())
                // The migration run time as an ISO string. This is not a real date type
                // because we can't know which date type is supported by all future dialects.
.addColumn('timestamp', 'varchar(255)', (col) => col.notNull()));
}
catch (error) {
const tableExists = await this.#doesTableExist(this.#migrationTable);
            // At least on PostgreSQL, `if not exists` doesn't guarantee the `create table`
            // query doesn't throw if the table already exists. That's why we check whether
            // the table exists here and ignore the error if it does.
if (!tableExists) {
throw error;
}
}
}
async #ensureMigrationLockTableExists() {
const tableExists = await this.#doesTableExist(this.#migrationLockTable);
if (tableExists) {
return;
}
try {
await this.#createIfNotExists(this.#props.db.schema
.withPlugin(this.#schemaPlugin)
.createTable(this.#migrationLockTable)
.addColumn('id', 'varchar(255)', (col) => col.notNull().primaryKey())
.addColumn('is_locked', 'integer', (col) => col.notNull().defaultTo(0)));
}
catch (error) {
const tableExists = await this.#doesTableExist(this.#migrationLockTable);
            // At least on PostgreSQL, `if not exists` doesn't guarantee the `create table`
            // query doesn't throw if the table already exists. That's why we check whether
            // the table exists here and ignore the error if it does.
if (!tableExists) {
throw error;
}
}
}
async #ensureLockRowExists() {
const lockRowExists = await this.#doesLockRowExists();
if (lockRowExists) {
return;
}
try {
await this.#props.db
.withPlugin(this.#schemaPlugin)
.insertInto(this.#migrationLockTable)
.values({ id: exports.MIGRATION_LOCK_ID, is_locked: 0 })
.execute();
}
catch (error) {
const lockRowExists = await this.#doesLockRowExists();
if (!lockRowExists) {
throw error;
}
}
}
async #doesSchemaExist() {
const schemas = await this.#props.db.introspection.getSchemas();
return schemas.some((it) => it.name === this.#migrationTableSchema);
}
async #doesTableExist(tableName) {
const schema = this.#migrationTableSchema;
const tables = await this.#props.db.introspection.getTables({
withInternalKyselyTables: true,
});
return tables.some((it) => it.name === tableName && (!schema || it.schema === schema));
}
async #doesLockRowExists() {
const lockRow = await this.#props.db
.withPlugin(this.#schemaPlugin)
.selectFrom(this.#migrationLockTable)
.where('id', '=', exports.MIGRATION_LOCK_ID)
.select('id')
.executeTakeFirst();
return !!lockRow;
}
async #runMigrations(getMigrationDirectionAndStep) {
const adapter = this.#props.db.getExecutor().adapter;
const lockOptions = (0, object_utils_js_1.freeze)({
lockTable: this.#props.migrationLockTableName ?? exports.DEFAULT_MIGRATION_LOCK_TABLE,
lockRowId: exports.MIGRATION_LOCK_ID,
lockTableSchema: this.#props.migrationTableSchema,
});
const run = async (db) => {
try {
await adapter.acquireMigrationLock(db, lockOptions);
const state = await this.#getState(db);
if (state.migrations.length === 0) {
return { results: [] };
}
const { direction, step } = getMigrationDirectionAndStep(state);
if (step <= 0) {
return { results: [] };
}
if (direction === 'Down') {
return await this.#migrateDown(db, state, step);
}
else if (direction === 'Up') {
return await this.#migrateUp(db, state, step);
}
return { results: [] };
}
finally {
await adapter.releaseMigrationLock(db, lockOptions);
}
};
if (adapter.supportsTransactionalDdl && !this.#props.disableTransactions) {
return this.#props.db.transaction().execute(run);
}
else {
return this.#props.db.connection().execute(run);
}
}
async #getState(db) {
const migrations = await this.#resolveMigrations();
const executedMigrations = await this.#getExecutedMigrations(db);
this.#ensureNoMissingMigrations(migrations, executedMigrations);
if (!this.#allowUnorderedMigrations) {
this.#ensureMigrationsInOrder(migrations, executedMigrations);
}
const pendingMigrations = this.#getPendingMigrations(migrations, executedMigrations);
return (0, object_utils_js_1.freeze)({
migrations,
executedMigrations,
lastMigration: (0, object_utils_js_1.getLast)(executedMigrations),
pendingMigrations,
});
}
#getPendingMigrations(migrations, executedMigrations) {
return migrations.filter((migration) => {
return !executedMigrations.includes(migration.name);
});
}
async #resolveMigrations() {
const allMigrations = await this.#props.provider.getMigrations();
return Object.keys(allMigrations)
.sort()
.map((name) => ({
...allMigrations[name],
name,
}));
}
async #getExecutedMigrations(db) {
const executedMigrations = await db
.withPlugin(this.#schemaPlugin)
.selectFrom(this.#migrationTable)
.select(['name', 'timestamp'])
.$narrowType()
.execute();
const nameComparator = this.#props.nameComparator || ((a, b) => a.localeCompare(b));
return (executedMigrations
// https://github.com/kysely-org/kysely/issues/843
.sort((a, b) => {
if (a.timestamp === b.timestamp) {
return nameComparator(a.name, b.name);
}
return (new Date(a.timestamp).getTime() - new Date(b.timestamp).getTime());
})
.map((it) => it.name));
}
#ensureNoMissingMigrations(migrations, executedMigrations) {
// Ensure all executed migrations exist in the `migrations` list.
for (const executed of executedMigrations) {
if (!migrations.some((it) => it.name === executed)) {
throw new Error(`corrupted migrations: previously executed migration ${executed} is missing`);
}
}
}
#ensureMigrationsInOrder(migrations, executedMigrations) {
// Ensure the executed migrations are the first ones in the migration list.
for (let i = 0; i < executedMigrations.length; ++i) {
if (migrations[i].name !== executedMigrations[i]) {
throw new Error(`corrupted migrations: expected previously executed migration ${executedMigrations[i]} to be at index ${i} but ${migrations[i].name} was found in its place. New migrations must always have a name that comes alphabetically after the last executed migration.`);
}
}
}
async #migrateDown(db, state, step) {
const migrationsToRollback = state.executedMigrations
.slice()
.reverse()
.slice(0, step)
.map((name) => {
return state.migrations.find((it) => it.name === name);
});
const results = migrationsToRollback.map((migration) => {
return {
migrationName: migration.name,
direction: 'Down',
status: 'NotExecuted',
};
});
for (let i = 0; i < results.length; ++i) {
const migration = migrationsToRollback[i];
try {
if (migration.down) {
await migration.down(db);
await db
.withPlugin(this.#schemaPlugin)
.deleteFrom(this.#migrationTable)
.where('name', '=', migration.name)
.execute();
results[i] = {
migrationName: migration.name,
direction: 'Down',
status: 'Success',
};
}
}
catch (error) {
results[i] = {
migrationName: migration.name,
direction: 'Down',
status: 'Error',
};
throw new MigrationResultSetError({
error,
results,
});
}
}
return { results };
}
async #migrateUp(db, state, step) {
const migrationsToRun = state.pendingMigrations.slice(0, step);
const results = migrationsToRun.map((migration) => {
return {
migrationName: migration.name,
direction: 'Up',
status: 'NotExecuted',
};
});
for (let i = 0; i < results.length; i++) {
const migration = state.pendingMigrations[i];
try {
await migration.up(db);
await db
.withPlugin(this.#schemaPlugin)
.insertInto(this.#migrationTable)
.values({
name: migration.name,
timestamp: new Date().toISOString(),
})
.execute();
results[i] = {
migrationName: migration.name,
direction: 'Up',
status: 'Success',
};
}
catch (error) {
results[i] = {
migrationName: migration.name,
direction: 'Up',
status: 'Error',
};
throw new MigrationResultSetError({
error,
results,
});
}
}
return { results };
}
async #createIfNotExists(qb) {
if (this.#props.db.getExecutor().adapter.supportsCreateIfNotExists) {
qb = qb.ifNotExists();
}
await qb.execute();
}
}
exports.Migrator = Migrator;
class MigrationResultSetError extends Error {
#resultSet;
constructor(result) {
super();
this.#resultSet = result;
}
get resultSet() {
return this.#resultSet;
}
}
