refactor: update exporting and testing

xiaowei 2021-01-05 19:13:26 +08:00
commit bc888b7339
36 changed files with 1359 additions and 96 deletions

View File

@@ -7,6 +7,12 @@ module.exports = {
'google',
],
'globals': {
'expect': 'readable',
'test': 'readable',
'describe': 'readable',
'beforeEach': 'readable',
'afterEach': 'readable',
'jest': 'readable',
'Atomics': 'readonly',
'SharedArrayBuffer': 'readonly',
},

View File

@@ -9,7 +9,7 @@ jobs:
strategy:
matrix:
- node-version: [8.x, 10.x, 12.x]
+ node-version: [10.x, 12.x, 14.x]
steps:
- uses: actions/checkout@v1
@@ -17,9 +17,10 @@ jobs:
uses: actions/setup-node@v1
with:
node-version: ${{ matrix.node-version }}
- - name: npm install, build, and test
+ - name: npm install, test, and build
run: |
npm install
npm test
npm run build
env:
CI: true

View File

@@ -2,6 +2,46 @@
All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.
## 4.0.0-beta.1 (2021-01-05)
### Features
* **fink:** refactoring selectStatement ([d7d32a3](https://github.com/DTStack/dt-sql-parser/commit/d7d32a382404df8917282d835134f50b1f3a6eff))
* **flin:** add split sql test ([3054e90](https://github.com/DTStack/dt-sql-parser/commit/3054e909176ee09333e9686e53f767c07c52693e))
* **flink:** add createTable grammar ([b7df08f](https://github.com/DTStack/dt-sql-parser/commit/b7df08f01287e7ded40581e85d79cc13a5ad794f))
* **flink:** add describe/explain/use/show statement and some test ([0ef8069](https://github.com/DTStack/dt-sql-parser/commit/0ef80696f49d26423d98642b82a60cc038c3d8ed))
* **flink:** add drop/alter test add add part of queryStatement ([9fc91a5](https://github.com/DTStack/dt-sql-parser/commit/9fc91a572af11843c801ca7749818a04d67039d2))
* **flink:** add grammar rules that pass the test ([afef8e6](https://github.com/DTStack/dt-sql-parser/commit/afef8e6d72533df170e5e862fd2a31708a55a52d))
* **flink:** add inset\drop\alter grammar ([158e235](https://github.com/DTStack/dt-sql-parser/commit/158e235b012d7ef263b92f8726b4235596b0c5b2))
* **flink:** add performance test ([cc1d5ab](https://github.com/DTStack/dt-sql-parser/commit/cc1d5abcdd2e1ccc9d0a383d933b9296b6f64333))
* **flink:** add queryStatement ([ba29949](https://github.com/DTStack/dt-sql-parser/commit/ba29949359325ca2b329d0e70a6ebfb431810fa8))
* **flink:** adjust lexer position to fix test error ([da9660c](https://github.com/DTStack/dt-sql-parser/commit/da9660c6fe7c9a8654bec89edf718cd38c160898))
* **flink:** merge parser and lexer in order to java test ([0735269](https://github.com/DTStack/dt-sql-parser/commit/0735269f4e641235188af461bd5df5cb416c9828))
* **flink:** modify grammar to match keywords ([e67f991](https://github.com/DTStack/dt-sql-parser/commit/e67f991ede461b847e6a3daa2bf71a00dc739d88))
* **flink:** perfect query statement ([1b9efdc](https://github.com/DTStack/dt-sql-parser/commit/1b9efdccd54ecf863bafb4192d8c294e86a5d8e1))
* **flink:** update createTable grammar ([d1259b4](https://github.com/DTStack/dt-sql-parser/commit/d1259b46a065d4b30fca2612b1146dbd040b03bb))
* **flink:** update flink SQL grammar ([04c4c40](https://github.com/DTStack/dt-sql-parser/commit/04c4c4039770bf351f067f6193d7f6ab0720a524))
* **flink:** update flinkSql grammar and add some test ([c9d88d0](https://github.com/DTStack/dt-sql-parser/commit/c9d88d00a606c7130415ab3db35f088ec0cecac1))
* **flink:** update grammar to match special columnName ([a9b1e90](https://github.com/DTStack/dt-sql-parser/commit/a9b1e90d73a733e47ea108f47790fe148fb6fa20))
* **utils:** add cleanSql、splitSql、lexer func and test ([7d6c753](https://github.com/DTStack/dt-sql-parser/commit/7d6c753d824cfb8e3808132356a2c78bda81795c))
* add generic and plsql basic parser file ([f851638](https://github.com/DTStack/dt-sql-parser/commit/f85163892a1b5249bbe73162cfc515100765fa22))
* add some query grammar ([a5ea7be](https://github.com/DTStack/dt-sql-parser/commit/a5ea7be6069e239ac20f69ffa1cc9f0c043b8dc3))
* create hive lexer and hive parser ([ec41432](https://github.com/DTStack/dt-sql-parser/commit/ec41432ee300f9b00306aaf1cecc95d02afe0302))
* refactoring hive SQL lexer ([557e9a3](https://github.com/DTStack/dt-sql-parser/commit/557e9a32466f6f899e809bd37927e091052033d9))
* **flinksql:** add some lexer ([6082c2b](https://github.com/DTStack/dt-sql-parser/commit/6082c2b151960189f3ff27a8b76e033b22b53561))
### Bug Fixes
* adjust import path ([e7e0c15](https://github.com/DTStack/dt-sql-parser/commit/e7e0c15b0d60095fbe85a37e7a60836d7fa34396))
* delete mock data ([da25815](https://github.com/DTStack/dt-sql-parser/commit/da2581533fa7a8196710c6046a03f57d476fd090))
* jest command ([76675e8](https://github.com/DTStack/dt-sql-parser/commit/76675e8251d272f43421e362d200ea7df4caca8e))
* lock antlr version ([d9c0928](https://github.com/DTStack/dt-sql-parser/commit/d9c0928f7a3a7367944523767fdc758dbdeb1268))
* ts problem ([6b16f75](https://github.com/DTStack/dt-sql-parser/commit/6b16f752e40e4170b6a92c37a67ed330fe9ab100))
* **flink:** clear useless comments ([771b562](https://github.com/DTStack/dt-sql-parser/commit/771b562c7893d89002b29cfeae9d2fbe0e8ee8d6))
* restore antlr4 config ([504f6df](https://github.com/DTStack/dt-sql-parser/commit/504f6df2ec8415a7c4a5fce1478d87f9ed5f4dd1))
### [4.0.1-beta](https://github.com/DTStack/dt-sql-parser/compare/v4.0.0-beta...v4.0.1-beta) (2021-01-05)

View File

@@ -8,6 +8,20 @@
## Integrate with Monaco Editor
## Release
Run the npm release script:
```bash
npm run release -- --release-as minor
```
Or
```bash
npm run release -- --release-as 1.1.0
```
## Reference
- <https://tomassetti.me/writing-a-browser-based-editor-using-monaco-and-antlr/>

View File

@@ -1,7 +1,7 @@
{
"name": "dt-sql-parser",
- "version": "4.0.1-beta",
- "description": "There are some sql parsers built with antlr4, and it's mainly for the **BigData** domain.",
+ "version": "4.0.0-beta.1",
+ "description": "SQL Parsers for BigData, built with antlr4",
"keywords": [
"sql",
"parser",
@@ -18,7 +18,8 @@
"build": "rm -rf dist && tsc",
"eslint": "eslint ./src/**/*.ts",
"check-types": "tsc --skipLibCheck",
- "test": "jest"
+ "test": "jest",
+ "release": "npm run build && standard-version --infile CHANGELOG.md"
},
"author": "dt-insight-front",
"license": "MIT",
@@ -39,6 +40,7 @@
},
"dependencies": {
"@types/antlr4": "4.7.0",
- "antlr4": "4.7.2"
+ "antlr4": "4.7.2",
+ "standard-version": "^9.1.0"
}
}

View File

@@ -1,2 +1,13 @@
export * from './parser';
export * from './utils';
+ export * from './lib/flinksql/FlinkSqlParserListener';
+ export * from './lib/flinksql/FlinkSqlParserVisitor';
+ export * from './lib/generic/SqlParserVisitor';
+ export * from './lib/generic/SqlParserListener';
+ export * from './lib/hive/HiveSqlListener';
+ export * from './lib/hive/HiveSqlVisitor';
+ export * from './lib/plsql/PlSqlParserListener';
+ export * from './lib/plsql/PlSqlParserVisitor';
+ export * from './lib/spark/SparkSqlVisitor';
+ export * from './lib/spark/SparkSqlListener';

View File

@@ -1,10 +1,7 @@
import { InputStream, CommonTokenStream, Lexer } from 'antlr4';
import { FlinkSqlLexer } from '../lib/flinksql/FlinkSqlLexer';
import { FlinkSqlParser } from '../lib/flinksql/FlinkSqlParser';
- export * from '../lib/flinksql/FlinkSqlParserListener';
- export * from '../lib/flinksql/FlinkSqlParserVisitor';
- import BasicParser from './common/BasicParser';
+ import BasicParser from './common/basicParser';
export default class FlinkSQL extends BasicParser {
public createLexer(input: string): Lexer {
View File

@@ -1,10 +1,7 @@
import { InputStream, CommonTokenStream, Lexer } from 'antlr4';
import { SqlLexer } from '../lib/generic/SqlLexer';
import { SqlParser } from '../lib/generic/SqlParser';
- export * from '../lib/generic/SqlParserVisitor';
- export * from '../lib/generic/SqlParserListener';
- import BasicParser from './common/BasicParser';
+ import BasicParser from './common/basicParser';
export default class GenericSQL extends BasicParser {
public createLexer(input: string): Lexer {

View File

@@ -1,10 +1,7 @@
import { InputStream, CommonTokenStream, Lexer } from 'antlr4';
import { HiveSqlLexer } from '../lib/hive/HiveSqlLexer';
import { HiveSql } from '../lib/hive/HiveSql';
- export * from '../lib/hive/HiveSqlListener';
- export * from '../lib/hive/HiveSqlVisitor';
- import BasicParser from './common/BasicParser';
+ import BasicParser from './common/basicParser';
export default class HiveSQL extends BasicParser {
public createLexer(input: string): Lexer {

View File

@@ -1,8 +1,3 @@
- export * from './generic';
- export * from './plsql';
- export * from './hive';
- export * from './flinksql';
- export * from './spark';
+ export { default as GenericSQL } from './generic';
+ export { default as PLSQL } from './plsql';
+ export { default as HiveSQL } from './hive';
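The change above is the heart of the refactor: `export * from './x'` forwards only named exports, so a default-exported parser class never reached the barrel. A minimal self-contained sketch of the new pattern (stand-in class with a toy `validate`, not the real ANTLR-backed parser):

```typescript
// Stand-in for a parser module whose main API is a default-exported class.
class HiveSQLImpl {
  // Toy validate: the real one runs the generated ANTLR parser and
  // collects syntax errors; an empty array means "no errors".
  validate(_sql: string): string[] {
    return [];
  }
}

// Inline equivalent of `export { default as HiveSQL } from './hive'`:
// the module's default export is re-bound under a stable public name.
export const HiveSQL = HiveSQLImpl;

const errors = new HiveSQL().validate('select id from user1;');
console.log(errors.length); // 0
```

Consumers can then write `import { HiveSQL } from 'dt-sql-parser'` instead of reaching into the per-parser module paths, which is exactly the change made in the tests below.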

View File

@@ -1,10 +1,8 @@
import { InputStream, CommonTokenStream, Lexer } from 'antlr4';
import { PlSqlLexer } from '../lib/plsql/PlSqlLexer';
import { PlSqlParser } from '../lib/plsql/PlSqlParser';
- export * from '../lib/plsql/PlSqlParserListener';
- export * from '../lib/plsql/PlSqlParserVisitor';
- import BasicParser from './common/BasicParser';
+ import BasicParser from './common/basicParser';
export default class PLSQLParser extends BasicParser {
public createLexer(input: string): Lexer {

View File

@@ -1,10 +1,7 @@
import { InputStream, CommonTokenStream, Lexer } from 'antlr4';
import { SparkSqlLexer } from '../lib/spark/SparkSqlLexer';
import { SparkSqlParser } from '../lib/spark/SparkSqlParser';
- export * from '../lib/spark/SparkSqlVisitor';
- export * from '../lib/spark/SparkSqlListener';
- import BasicParser from './common/BasicParser';
+ import BasicParser from './common/basicParser';
export default class SparkSQL extends BasicParser {
public createLexer(input: string): Lexer {

View File

@@ -1,6 +1,7 @@
- import SQLParser from '../../../src/parser/flinksql';
+ import { FlinkSQL } from '../../../src';
describe('FlinkSQL Lexer tests', () => {
- const parser = new SQLParser();
+ const parser = new FlinkSQL();
const sql = 'SELECT * FROM table1';
const tokens = parser.getAllTokens(sql);
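For intuition, `getAllTokens` feeds the input through the generated ANTLR lexer and returns the resulting token list. A toy stand-in (plain whitespace splitting, not the real `FlinkSqlLexer`) shows the shape of the call:

```typescript
// Toy stand-in for parser.getAllTokens — illustration only; the real
// method drives the generated ANTLR FlinkSqlLexer over the input.
function toyGetAllTokens(sql: string): string[] {
  return sql.split(/\s+/).filter((token) => token.length > 0);
}

const tokens = toyGetAllTokens('SELECT * FROM table1');
console.log(tokens.length); // 4
```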

View File

@@ -1,11 +1,9 @@
- import
- SQLParser, { FlinkSqlParserListener }
- from '../../../src/parser/flinksql';
+ import { FlinkSQL, FlinkSqlParserListener } from '../../../src';
describe('Flink SQL Listener Tests', () => {
const expectTableName = 'user1';
const sql = `select id,name,sex from ${expectTableName};`;
- const parser = new SQLParser();
+ const parser = new FlinkSQL();
const parserTree = parser.parse(sql);

View File

@@ -1,7 +1,7 @@
- import SQLParser from '../../../src/parser/flinksql';
+ import { FlinkSQL } from '../../../src';
describe('FlinkSQL Syntax Tests', () => {
- const parser = new SQLParser();
+ const parser = new FlinkSQL();
// Create statements
test('Test simple CreateTable Statement', () => {

View File

@@ -1,9 +1,9 @@
- import SQLParser, { FlinkSqlParserVisitor } from '../../../src/parser/flinksql';
+ import { FlinkSQL, FlinkSqlParserVisitor } from '../../../src';
describe('Flink SQL Visitor Tests', () => {
const expectTableName = 'user1';
const sql = `select id,name,sex from ${expectTableName};`;
- const parser = new SQLParser();
+ const parser = new FlinkSQL();
const parserTree = parser.parse(sql, (error) => {
console.log('Parse error:', error);

View File

@@ -1,7 +1,7 @@
- import SQLParser from '../../../src/parser/generic';
+ import { GenericSQL } from '../../../src/';
describe('GenericSQL Lexer tests', () => {
- const mysqlParser = new SQLParser();
+ const mysqlParser = new GenericSQL();
const sql = 'select id,name,sex from user1;';
const tokens = mysqlParser.getAllTokens(sql);

View File

@@ -1,9 +1,9 @@
- import SQLParser, { SqlParserListener } from '../../../src/parser/generic';
+ import { GenericSQL, SqlParserListener } from '../../../src';
describe('Generic SQL Listener Tests', () => {
const expectTableName = 'user1';
const sql = `select id,name,sex from ${expectTableName};`;
- const parser = new SQLParser();
+ const parser = new GenericSQL();
const parserTree = parser.parse(sql);

View File

@@ -1,7 +1,7 @@
- import SQLParser from '../../../src/parser/generic';
+ import { GenericSQL } from '../../../src';
describe('Generic SQL Syntax Tests', () => {
- const parser = new SQLParser();
+ const parser = new GenericSQL();
test('Select Statement', () => {
const sql = 'select id,name from user1;';

View File

@@ -1,9 +1,9 @@
- import SQLParser, { SqlParserVisitor } from '../../../src/parser/generic';
+ import { GenericSQL, SqlParserVisitor } from '../../../src';
describe('Generic SQL Visitor Tests', () => {
const expectTableName = 'user1';
const sql = `select id,name,sex from ${expectTableName};`;
- const parser = new SQLParser();
+ const parser = new GenericSQL();
const parserTree = parser.parse(sql, (error) => {
console.log('Parse error:', error);

View File

@@ -1,7 +1,7 @@
- import SQLParser from '../../../src/parser/hive';
+ import { HiveSQL } from '../../../src';
describe('HiveSQL Lexer tests', () => {
- const parser = new SQLParser();
+ const parser = new HiveSQL();
test('select token counts', () => {
const sql = 'SELECT * FROM t1';
const tokens = parser.getAllTokens(sql);

View File

@@ -1,7 +1,7 @@
- import SQLParser, { HiveSqlListener } from '../../../src/parser/hive';
+ import { HiveSQL, HiveSqlListener } from '../../../src';
describe('Hive SQL Listener Tests', () => {
- const parser = new SQLParser();
+ const parser = new HiveSQL();
test('Listener enterSelectList', async () => {
const expectTableName = 'userName';
const sql = `select ${expectTableName} from user1 where inc_day='20190601' limit 1000;`;

View File

@@ -1,7 +1,7 @@
- import SQLParser from '../../../src/parser/hive';
+ import { HiveSQL } from '../../../src';
describe('Hive SQL Syntax Tests', () => {
- const parser = new SQLParser();
+ const parser = new HiveSQL();
test('Create Table Statement', () => {
const sql = 'CREATE TABLE person(name STRING,age INT);';
const result = parser.validate(sql);
@@ -13,10 +13,10 @@ describe('Hive SQL Syntax Tests', () => {
expect(result.length).toBe(0);
});
test('Wrong Select Statement', () => {
- const sql = 'SELECT add ABC from Where ;'
+ const sql = 'SELECT add ABC from Where ;';
const result = parser.validate(sql);
expect(result.length).toBe(2);
- expect(result[0].message).toBe(`no viable alternative at input 'SELECTaddABCfromWhere'`)
- expect(result[1].message).toBe(`mismatched input 'Where' expecting <EOF>`)
+ expect(result[0].message).toBe(`no viable alternative at input 'SELECTaddABCfromWhere'`);
+ expect(result[1].message).toBe(`mismatched input 'Where' expecting <EOF>`);
});
});

View File

@@ -1,9 +1,9 @@
- import SQLParser, { HiveSqlVisitor } from '../../../src/parser/hive';
+ import { HiveSQL, HiveSqlVisitor } from '../../../src';
describe('Generic SQL Visitor Tests', () => {
const expectTableName = 'dm_gis.dlv_addr_tc_count';
const sql = `select citycode,tc,inc_day from ${expectTableName} where inc_day='20190501' limit 100;`;
- const parser = new SQLParser();
+ const parser = new HiveSQL();
const parserTree = parser.parse(sql, (error) => {
console.log('Parse error:', error);

View File

@@ -1,7 +1,7 @@
- import SQLParser from '../../../src/parser/plsql';
+ import { PLSQL } from '../../../src';
describe('PLSQL Lexer tests', () => {
- const parser = new SQLParser();
+ const parser = new PLSQL();
const sql = 'select id,name,sex from user1;';
const tokens = parser.getAllTokens(sql);

View File

@@ -1,9 +1,9 @@
- import SQLParser, { PlSqlParserListener } from '../../../src/parser/plsql';
+ import { PLSQL, PlSqlParserListener } from '../../../src';
describe('PLSQL Listener Tests', () => {
const expectTableName = 'user1';
const sql = `select id,name,sex from ${expectTableName};`;
- const parser = new SQLParser();
+ const parser = new PLSQL();
const parserTree = parser.parse(sql);

View File

@@ -1,7 +1,7 @@
- import SQLParser from '../../../src/parser/plsql';
+ import { PLSQL } from '../../../src';
describe('PLSQL Syntax Tests', () => {
- const parser = new SQLParser();
+ const parser = new PLSQL();
test('Test simple select Statement', () => {
const sql = 'select id,name from user1;';

View File

@@ -1,9 +1,9 @@
- import SQLParser, { PlSqlParserVisitor } from '../../../src/parser/plsql';
+ import { PLSQL, PlSqlParserVisitor } from '../../../src';
describe('PLSQL Visitor Tests', () => {
const expectTableName = 'user1';
const sql = `select id,name,sex from ${expectTableName};`;
- const parser = new SQLParser();
+ const parser = new PLSQL();
const parserTree = parser.parse(sql);

View File

@@ -1,9 +1,9 @@
- import SQLParser from '../../../src/parser/spark';
+ import { SparkSQL } from '../../../src';
const log = console.log.bind(console);
describe('SparkSQL Lexer tests', () => {
- const parser = new SQLParser();
+ const parser = new SparkSQL();
test('select id,name from user1;', () => {
const sql = `select id,name from user1;`;

View File

@@ -1,9 +1,9 @@
- import SQLParser, { SparkSqlListener } from '../../../src/parser/spark';
+ import { SparkSQL, SparkSqlListener } from '../../../src';
describe('Spark SQL Listener Tests', () => {
const expectTableName = 'user1';
const sql = `select id,name,sex from ${expectTableName};`;
- const parser = new SQLParser();
+ const parser = new SparkSQL();
const parserTree = parser.parse(sql);

View File

@@ -1,10 +1,9 @@
/* eslint-disable max-len */
- import SQLParser from '../../../src/parser/spark';
+ import { SparkSQL } from '../../../src';
const error = console.log.bind(console, '***** error\n');
const validateTest = (sqls) => {
- const parser = new SQLParser();
+ const parser = new SparkSQL();
sqls.forEach((sql, i) => {
const result = parser.validate(sql);
if (result.length !== 0) {

View File

@@ -1,9 +1,9 @@
- import SQLParser, { SparkSqlVisitor } from '../../../src/parser/spark';
+ import { SparkSQL, SparkSqlVisitor } from '../../../src';
describe('Spark SQL Visitor Tests', () => {
const expectTableName = 'user1';
const sql = `select id,name,sex from ${expectTableName};`;
- const parser = new SQLParser();
+ const parser = new SparkSQL();
const parserTree = parser.parse(sql, (error) => {
console.log('Parse error:', error);

View File

@@ -1,8 +1,9 @@
- import * as utils from '../../src/utils';
+ import { lexer, splitSql, cleanSql } from '../../src';
describe('utils', () => {
test('split single sql', () => {
const sql = 'select id,name from user';
- const result = utils.splitSql(sql);
+ const result = splitSql(sql);
expect(result.length).toEqual(1);
});
test('split multiple sql', () => {
@@ -13,7 +14,7 @@ describe('utils', () => {
xxx
*/
select user from b`;
- const result = utils.splitSql(sql);
+ const result = splitSql(sql);
expect(result.length).toEqual(2);
});
test('lexer', () => {
@@ -24,7 +25,7 @@ describe('utils', () => {
xxx
*/
select user from b;`;
- const result = utils.lexer(sql);
+ const result = lexer(sql);
expect(result.length).toEqual(4);
});
test('cleanSql', () => {
@@ -35,7 +36,7 @@ describe('utils', () => {
xxx
*/
select user from b`;
- const result = utils.cleanSql(sql);
+ const result = cleanSql(sql);
expect(result.indexOf('xxx')).toEqual(-1);
});
});
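The root now re-exports the utils as well. As a rough mental model only (an assumption-laden simplification — the package's real `splitSql` is lexer-based, so a `;` inside a string literal or comment does not split a statement):

```typescript
// Naive illustration of statement splitting — NOT the package's
// implementation; it only shows why 'select a; select b;' yields 2.
function naiveSplitSql(sql: string): string[] {
  return sql
    .split(';')
    .map((stmt) => stmt.trim())
    .filter((stmt) => stmt.length > 0);
}

const parts = naiveSplitSql('select id,name from user1; select user from b;');
console.log(parts.length); // 2
```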

View File

@@ -1,12 +1,13 @@
{
"compilerOptions": {
"outDir": "./dist/",
- "sourceMap": false,
+ "sourceMap": true,
"allowJs":true,
"target": "es6",
"module": "commonjs",
"noUnusedLocals": true,
"noUnusedParameters": false,
"skipLibCheck": true,
"typeRoots": [
"node",
"node_modules/@types",
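Enabling `sourceMap` makes `tsc` emit a `.js.map` beside each compiled file in `dist`, so the published build can be debugged against the TypeScript sources, while `skipLibCheck` skips type-checking `.d.ts` files from dependencies. A condensed sketch of the relevant options (assumed consolidation of the diff above, not the full file):

```json
{
  "compilerOptions": {
    "outDir": "./dist/",
    "sourceMap": true,
    "target": "es6",
    "module": "commonjs",
    "skipLibCheck": true
  }
}
```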

yarn.lock

File diff suppressed because it is too large.