feat: init the project

wangyu 2022-09-06 16:13:01 +08:00
commit bd1dfdca65
45 changed files with 3358 additions and 0 deletions

.gitignore vendored Normal file
@@ -0,0 +1,5 @@
.idea
*.iml
.DS_Store
target

LICENSE Normal file
@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

README.md Normal file
@@ -0,0 +1,107 @@
# fluent-sql
A SQL builder built on a fluent API. It aims to outclass MyBatis-Plus in ease of use, with an API that is a real pleasure to write.
## Features
1. Parses SQL at the syntax level; an innovative fragment-based construction mode ignores nesting depth, so clauses can be nested arbitrarily
2. High-quality code with no redundant design, keeping your codebase safe
3. Built-in protection against SQL injection and an empty-condition strategy; finer-grained configuration is planned
4. Supports joins and data binding across any number of tables
5. Maps results back to entities; annotation-driven development keeps the code decoupled and object-oriented
6. Smart alias strategy: no more worrying about aliases across multiple tables; the code stays concise and readable, so writing SQL in Java feels first-class
7. Fine-grained API control over SQL construction: each step is strictly checked, so after one call the next valid steps follow naturally and cannot go wrong
## Quick start
This small library tackles how SQL gets written: it lets you express SQL more elegantly, without worrying about frequent rewrites caused by changes of database dialect (SQL Dialect).
To produce the following SQL:
```sql
SELECT t1.`id` AS `id`,
t1.`name` AS `name`,
t1.`identifier` AS `identifier`,
`max_tenant` AS `maxTenant`,
t2.`related_id` AS `relatedId`
FROM saas_tenant `t1`
INNER JOIN saas_quota `t2` ON t2.`related_id` = t1.`id`
AND t1.`identifier` = ?
WHERE t1.`id` = ?
AND t1.`name` LIKE CONCAT('%', ?, '%')
AND t2.`related_id` IN (?, ?)
AND t2.`related_type` = ?
ORDER BY t1.`create_time` DESC,
t1.`id` ASC
```
you only need to write the following Java code:
```java
public class TestSql {
/**
* Runs the test SQL
* @return the assembled entity
*/
public TenantContext executeSql() {
// the query starts here
return select(
// several fields of one table
composite(SaasTenant::getId, SaasTenant::getName, SaasTenant::getIdentifier),
// all fields of one table
all(SaasProperties.class),
// fields of other tables
composite(SaasQuota::getId, SaasQuota::getRelatedType),
// a field with an explicit alias
composite(SaasQuota::getRelatedId, "relatedOtherId"))
.from(SaasTenant.class)
.join(SaasQuota.class).on(where(SaasQuota::getRelatedId).eq(SaasTenant::getId)
.and(SaasTenant::getIdentifier).eq(5))
.join(SaasProperties.class)
.then()
.matching(where(SaasTenant::getId).eq("1")
.and(SaasTenant::getName).like("王大锤")
.and(SaasQuota::getRelatedId).in(Arrays.asList("5", "10"))
.and(SaasQuota::getRelatedType).eq(SaasQuota.RelatedType.TENANT)
)
.order(by(SaasTenant::getCreateTime).desc(), by(SaasTenant::getId).asc())
.one(TenantContext.class);
}
}
```
## Typical workflow
Static imports are recommended to keep the code tidy:
1. SQL.select => select
2. SelectComposite.composite => composite
3. SelectComposite.all => all
4. Order.by => by
5. Query.where => where
For the demos below, the code assumes these statically imported functions:
1. Bind the data source that executes your SQL:
```java
class SQLConfig {
void doConfig() {
// 创建或者获取您的数据源
DataSource dataSource = createDataSource(...);
// instantiate on top of Spring's JdbcTemplate
SQL.bind(new JdbcTemplate(dataSource));
}
}
```
2. Write a result object (VO) to receive the execution result, such as `TenantContext` in the quick start.
3. Write the SQL in Java, then call `.one()` or `.list()` to return one row or many.
4. To query all fields of a single table, use `select().from(TableClass.class)`.
5. To query all fields of one table in a multi-table query, use `select(all(TableClass.class)).from(TableClass.class).join(Other.class).then()`.
6. To query selected fields from multiple tables with aliases, see `select(composite(TableA::getName, "nameA"), composite(TableB::getName, "nameB")).from(TableA.class).join(TableB.class)...`.

fluent-sql-core/pom.xml Normal file
@@ -0,0 +1,37 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent>
<artifactId>fluent-sql</artifactId>
<groupId>group.flyfish.framework</groupId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>fluent-sql-core</artifactId>
<properties>
<maven.compiler.source>8</maven.compiler.source>
<maven.compiler.target>8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
</dependency>
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>persistence-api</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</dependency>
</dependencies>
</project>

@@ -0,0 +1,19 @@
package group.flyfish.fluent.binding;
import java.lang.annotation.*;
/**
* Specifies an alias used when mapping results
*
* @author wangyu
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface Alias {
/**
* @return the alias
*/
String value();
}
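At mapping time, the `@Alias` annotation binds a column whose name differs from the bean field. A minimal, self-contained sketch of that reflection lookup follows; the `TenantVo` class, its field, and the locally redeclared annotation are illustrative stand-ins, not types from this library:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

public class AliasDemo {

    // Local stand-in for group.flyfish.fluent.binding.Alias, so the sketch runs alone.
    @Target(ElementType.FIELD)
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Alias {
        String value();
    }

    // Illustrative result object: the column "related_id" maps onto field relatedId.
    public static class TenantVo {
        @Alias("related_id")
        private String relatedId;
    }

    // Resolve the column name a field should be populated from:
    // prefer the annotation value, fall back to the field name.
    public static String columnFor(Field field) {
        Alias alias = field.getAnnotation(Alias.class);
        return alias != null ? alias.value() : field.getName();
    }

    public static void main(String[] args) throws Exception {
        Field field = TenantVo.class.getDeclaredField("relatedId");
        System.out.println(columnFor(field)); // prints "related_id"
    }
}
```

The annotation is redeclared locally only to keep the sketch compilable on its own; a real mapper would read the library's annotation instead.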

@@ -0,0 +1,14 @@
package group.flyfish.fluent.binding;
import java.lang.annotation.*;
/**
* Marks a field whose column value is deserialized from JSON
*
* @author wangyu
*/
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface JSONInject {
}

@@ -0,0 +1,26 @@
package group.flyfish.fluent.chain;
import group.flyfish.fluent.utils.sql.SFunction;
/**
* Chained declaration of ordering
*
* @author wangyu
*/
public interface Order extends SQLSegment {
/**
* Orders by the given field
*
* @param field the field to order by
* @param <T>   the entity type
* @return the chained order
*/
static <T> Order by(SFunction<T, ?> field) {
return new OrderImpl(field);
}
Order asc();
Order desc();
}
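`Order.by` accepts an `SFunction`, a serializable getter reference that the framework resolves to a property/column name. A common way to do that resolution is through the hidden `writeReplace` method that serializable lambdas carry, which yields a `SerializedLambda`. The following self-contained sketch shows the idea; `NamedGetter` and `User` are stand-ins, not this library's actual types:

```java
import java.io.Serializable;
import java.lang.invoke.SerializedLambda;
import java.lang.reflect.Method;
import java.util.function.Function;

public class LambdaNameDemo {

    // Minimal stand-in for SFunction: a Function that is also Serializable,
    // which forces the JDK to generate a writeReplace method on the lambda.
    public interface NamedGetter<T, R> extends Function<T, R>, Serializable {}

    // Illustrative entity.
    public static class User {
        private String userName;
        public String getUserName() { return userName; }
    }

    // Resolve a getter reference like User::getUserName to "userName".
    public static <T> String resolveName(NamedGetter<T, ?> getter) throws Exception {
        Method writeReplace = getter.getClass().getDeclaredMethod("writeReplace");
        writeReplace.setAccessible(true);
        SerializedLambda lambda = (SerializedLambda) writeReplace.invoke(getter);
        String method = lambda.getImplMethodName();            // e.g. "getUserName"
        String prop = method.startsWith("get") ? method.substring(3) : method;
        return Character.toLowerCase(prop.charAt(0)) + prop.substring(1);
    }

    public static void main(String[] args) throws Exception {
        NamedGetter<User, String> ref = User::getUserName;
        System.out.println(resolveName(ref)); // prints "userName"
    }
}
```

A real implementation would additionally handle `is`-prefixed boolean getters and cache the resolved names; this sketch only shows the core trick.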

@@ -0,0 +1,37 @@
package group.flyfish.fluent.chain;
import group.flyfish.fluent.utils.sql.SFunction;
import lombok.RequiredArgsConstructor;
/**
* Ordering support
*
* @author wangyu
*/
@RequiredArgsConstructor
final class OrderImpl implements Order, SQLSegment {
private final SFunction<?, ?> field;
private String order = "asc";
@Override
public OrderImpl asc() {
order = "asc";
return this;
}
@Override
public OrderImpl desc() {
order = "desc";
return this;
}
/**
* @return the SQL fragment
*/
@Override
public String get() {
return String.join(" ", field.getName(), order);
}
}

@@ -0,0 +1,64 @@
package group.flyfish.fluent.chain;
import group.flyfish.fluent.chain.common.PreSqlChain;
import group.flyfish.fluent.chain.select.SelectComposite;
import group.flyfish.fluent.update.Update;
import group.flyfish.fluent.utils.sql.SFunction;
import org.springframework.jdbc.core.JdbcOperations;
import static group.flyfish.fluent.utils.sql.SqlNameUtils.cast;
/**
* Entry point for chained queries
*
* @author wangyu
*/
public interface SQL {
/**
* Starts a SELECT query
*
* @param fields the fields to select
* @return the pre-query chain
*/
@SafeVarargs
static <T> PreSqlChain select(SFunction<T, ?>... fields) {
return SQLFactory.produce().select(fields);
}
/**
* Starts a SELECT query
*
* @param composites the field composites to select
* @return the pre-query chain
*/
static PreSqlChain select(SelectComposite<?>... composites) {
return SQLFactory.produce().select(SelectComposite.combine(composites));
}
/**
* Starts a SELECT query over all fields
*
* @return the pre-query chain
*/
static PreSqlChain select() {
return SQLFactory.produce().select(cast(new SFunction[]{}));
}
/**
* Starts an UPDATE statement
*
* @param clazz the table entity class
* @return the update chain
*/
static Update update(Class<?> clazz) {
return SQLFactory.produce().update(clazz);
}
/**
* Binds the data-source context, backed by a JdbcTemplate
*
* @param operations the JDBC operations
*/
static void bind(JdbcOperations operations) {
SQLImpl.bind(operations);
}
}

@@ -0,0 +1,19 @@
package group.flyfish.fluent.chain;
/**
* SQL factory: a plain static factory producing implementation instances
*
* @author wangyu
*/
public interface SQLFactory {
/**
* Produces an instance
*
* @return the SQL operations entry
*/
static SQLOperations produce() {
return new SQLImpl();
}
}

@@ -0,0 +1,255 @@
package group.flyfish.fluent.chain;
import group.flyfish.fluent.chain.common.AfterJoinSqlChain;
import group.flyfish.fluent.chain.common.HandleSqlChain;
import group.flyfish.fluent.chain.common.PreSqlChain;
import group.flyfish.fluent.chain.select.AfterOrderSqlChain;
import group.flyfish.fluent.chain.select.AfterWhereSqlChain;
import group.flyfish.fluent.chain.update.AfterSetSqlChain;
import group.flyfish.fluent.mapping.SQLMappedRowMapper;
import group.flyfish.fluent.query.JoinCandidate;
import group.flyfish.fluent.query.Parameterized;
import group.flyfish.fluent.query.Query;
import group.flyfish.fluent.update.Update;
import group.flyfish.fluent.update.UpdateImpl;
import group.flyfish.fluent.utils.context.AliasComposite;
import group.flyfish.fluent.utils.data.ParameterUtils;
import group.flyfish.fluent.utils.sql.ConcatSegment;
import group.flyfish.fluent.utils.sql.EntityNameUtils;
import group.flyfish.fluent.utils.sql.SFunction;
import group.flyfish.fluent.utils.sql.SqlNameUtils;
import org.springframework.dao.EmptyResultDataAccessException;
import org.springframework.jdbc.core.JdbcOperations;
import org.springframework.util.Assert;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
/**
* Core implementation of the SQL chain
*
* @author wangyu
*/
final class SQLImpl extends ConcatSegment<SQLImpl> implements SQLOperations, PreSqlChain, HandleSqlChain, AfterJoinSqlChain, AfterSetSqlChain {
// shared JDBC operations
private static JdbcOperations SHARED_OPERATIONS;
// ordered parameter list
private final List<Object> parameters = new ArrayList<>();
// debug flag
private final boolean debug = false;
/**
* Binds shared JDBC operations, e.g. a JdbcTemplate
*
* @param operations the JDBC operations
*/
public static void bind(JdbcOperations operations) {
SHARED_OPERATIONS = operations;
}
/**
* Starts a SELECT
*
* @param fields the fields to select; empty means all fields
* @return the pre-query chain
*/
@SafeVarargs
@Override
public final <T> PreSqlChain select(SFunction<T, ?>... fields) {
String linker = !segments.isEmpty() ? "," : "SELECT";
if (fields != null && fields.length != 0) {
return this.concat(linker)
.concat(() -> Arrays.stream(fields).map(SFunction::getSelect).collect(Collectors.joining(",")));
}
return this.concat(linker).concat("*");
}
/**
* Starts an UPDATE
*
* @param clazz the table entity class
* @return the update chain
*/
@Override
public <T> Update update(Class<T> clazz) {
return new UpdateImpl(update -> {
if (withoutParameter(update)) return this;
return this.concat("UPDATE")
.concat(() -> EntityNameUtils.getTableName(clazz))
.concat("SET")
.concat(update);
});
}
/**
* Selects from the given table
*
* @param type the table entity class
* @return the handle chain
*/
public HandleSqlChain from(Class<?> type) {
return from(type, null);
}
/**
* Selects from the given table with an explicit alias.
* Useful when the same table appears in FROM more than once, e.g. self-joins;
* in most cases you do not need to specify an alias.
*
* @param type  the table entity class
* @param alias the alias
* @return the handle chain
*/
@Override
public HandleSqlChain from(Class<?> type, String alias) {
String mapped = AliasComposite.add(type, alias);
return concat("FROM")
.concat(() -> EntityNameUtils.getTableName(type))
.concat(() -> SqlNameUtils.wrap(mapped));
}
/**
* Full JOIN support
*
* @param type  the join type
* @param clazz the target table
* @param alias the alias
* @return the after-join chain
*/
@Override
public AfterJoinSqlChain join(JoinCandidate type, Class<?> clazz, String alias) {
String mapped = AliasComposite.add(clazz, alias);
return concat(type)
.concat(() -> EntityNameUtils.getTableName(clazz))
.concat(() -> SqlNameUtils.wrap(mapped));
}
/**
* The join condition
*
* @param query the condition query
* @return the handle chain
*/
@Override
public HandleSqlChain on(Query query) {
if (withoutParameter(query)) return this;
return concat("ON").concat(query);
}
/**
* Continues without a join condition
*
* @return the handle chain
*/
@Override
public HandleSqlChain then() {
return this;
}
/**
* Appends the WHERE condition
*
* @param query the condition
* @return the after-where chain
*/
public AfterWhereSqlChain matching(Query query) {
if (withoutParameter(query)) return this;
return concat("WHERE").concat(query);
}
/**
* Appends the ORDER BY clause
*
* @param orders the orderings
* @return the after-order chain
*/
@Override
public AfterOrderSqlChain order(Order... orders) {
if (null != orders && orders.length != 0) {
return concat("ORDER BY")
.concat(() -> Arrays.stream(orders).map(SQLSegment::get).collect(Collectors.joining(",")));
}
return this;
}
/**
* Executes and fetches a single result
*
* @param clazz the result class
* @param <T>   the result type
* @return the single result, or null if none
*/
public <T> T one(Class<T> clazz) {
try {
return SHARED_OPERATIONS.queryForObject(sql().concat(" limit 1"),
new SQLMappedRowMapper<>(clazz), parsedParameters());
} catch (EmptyResultDataAccessException e) {
return null;
}
}
/**
* Executes and fetches multiple results
*
* @param clazz the result class
* @return the result list
*/
@Override
public <T> List<T> list(Class<T> clazz) {
return SHARED_OPERATIONS.query(sql(), new SQLMappedRowMapper<>(clazz), parsedParameters());
}
/**
* Executes and returns the number of affected rows
*
* @return the affected row count
*/
@Override
public int execute() {
return SHARED_OPERATIONS.update(sql(), parsedParameters());
}
/**
* Builds the SQL string
*
* @return the built SQL
*/
private String sql() {
Assert.notNull(SHARED_OPERATIONS, "No data source bound for execution!");
String sql = segments.stream().map(SQLSegment::get).collect(Collectors.joining(" "));
if (debug) {
System.out.println("prepared sql: " + sql);
System.out.println("prepared args:" + parameters.stream().map(ParameterUtils::convert).map(String::valueOf)
.collect(Collectors.joining(",")));
}
AliasComposite.flush();
return sql;
}
/**
* The converted parameters
*
* @return the converted parameter array
*/
private Object[] parsedParameters() {
return parameters.stream().map(ParameterUtils::convert).toArray();
}
/**
* Checks whether the segment carries no parameters
*
* @param params the parameterized segment
* @return true if there are no parameters, in which case the segment is not appended
*/
private boolean withoutParameter(Parameterized params) {
if (params.isEmpty() || null == params.getParameters()) {
return true;
}
parameters.addAll(params.getParameters());
return false;
}
}
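`SQLImpl` builds the statement by appending fragments to `segments` through chained `concat` calls, then joining them with single spaces in `sql()`. Stripped of the framework types, the core idea can be sketched as follows (all names here are illustrative, not the library's API):

```java
import java.util.ArrayList;
import java.util.List;

public class SegmentDemo {

    // Ordered SQL fragments, appended by each chained call.
    private final List<String> segments = new ArrayList<>();

    // Append one fragment and return this, enabling the fluent chain.
    public SegmentDemo concat(String segment) {
        segments.add(segment);
        return this;
    }

    // Join all fragments with single spaces into the final statement.
    public String sql() {
        return String.join(" ", segments);
    }

    public static void main(String[] args) {
        String sql = new SegmentDemo()
                .concat("SELECT").concat("*")
                .concat("FROM").concat("saas_tenant")
                .concat("WHERE").concat("id = ?")
                .sql();
        System.out.println(sql); // prints "SELECT * FROM saas_tenant WHERE id = ?"
    }
}
```

The real class additionally defers fragment evaluation (lambdas resolved only when `sql()` runs) so aliases registered later in the chain are still visible; this sketch omits that.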

@@ -0,0 +1,30 @@
package group.flyfish.fluent.chain;
import group.flyfish.fluent.chain.common.PreSqlChain;
import group.flyfish.fluent.update.Update;
import group.flyfish.fluent.utils.sql.SFunction;
/**
* SQL operations entry
*/
public interface SQLOperations {
/**
* Starts a SELECT
*
* @param fields the fields to select; empty means all fields
* @param <T>    the entity type
* @return the pre-query chain
*/
@SuppressWarnings("unchecked")
<T> PreSqlChain select(SFunction<T, ?>... fields);
/**
* Starts an UPDATE
*
* @param clazz the table entity class
* @param <T>   the entity type
* @return the update chain
*/
<T> Update update(Class<T> clazz);
}

@@ -0,0 +1,29 @@
package group.flyfish.fluent.chain;
import group.flyfish.fluent.utils.sql.SqlNameUtils;
/**
* A SQL fragment
*
* @author wangyu
*/
@FunctionalInterface
public interface SQLSegment {
/**
* @return the SQL fragment
*/
String get();
/**
* Force-casts a value. Use with caution, unless you know the real type.
*
* @param value any value
* @param <T>   the input type
* @param <R>   the output type
* @return the cast value
*/
default <T, R> R cast(T value) {
return SqlNameUtils.cast(value);
}
}

@@ -0,0 +1,26 @@
package group.flyfish.fluent.chain.common;
import group.flyfish.fluent.query.Query;
/**
* Operations available after a JOIN
*
* @author wangyu
*/
public interface AfterJoinSqlChain {
/**
* The join condition
*
* @param query the condition query
* @return the handle chain
*/
HandleSqlChain on(Query query);
/**
* Continues without a join condition
*
* @return the handle chain
*/
HandleSqlChain then();
}

@@ -0,0 +1,16 @@
package group.flyfish.fluent.chain.common;
/**
* An executable SQL statement
*
* @author wangyu
*/
public interface ExecutableSql {
/**
* Executes and returns the number of affected rows
*
* @return the affected row count
*/
int execute();
}

@@ -0,0 +1,20 @@
package group.flyfish.fluent.chain.common;
import group.flyfish.fluent.chain.select.AfterWhereSqlChain;
import group.flyfish.fluent.query.Query;
/**
* The SQL chain in its handling stage
*
* @author wangyu
*/
public interface HandleSqlChain extends JoinOperations, AfterWhereSqlChain {
/**
* Appends the WHERE condition
*
* @param query the condition
* @return the after-where chain
*/
AfterWhereSqlChain matching(Query query);
}

@@ -0,0 +1,84 @@
package group.flyfish.fluent.chain.common;
import group.flyfish.fluent.query.JoinCandidate;
/**
* Table join operations
*
* @author wangyu
*/
public interface JoinOperations {
/**
* Full JOIN support
*
* @param type  the join type
* @param clazz the target table
* @param alias the alias
* @return the after-join chain
*/
AfterJoinSqlChain join(JoinCandidate type, Class<?> clazz, String alias);
/**
* Joins another table with INNER JOIN
*
* @param clazz the other table's entity
* @return the after-join chain
*/
default AfterJoinSqlChain join(Class<?> clazz) {
return join(JoinCandidate.INNER_JOIN, clazz, null);
}
/**
* Joins another table with INNER JOIN
*
* @param clazz the other table's entity
* @param alias the alias
* @return the after-join chain
*/
default AfterJoinSqlChain join(Class<?> clazz, String alias) {
return join(JoinCandidate.INNER_JOIN, clazz, alias);
}
/**
* Joins another table with LEFT JOIN
*
* @param clazz the other table's entity
* @return the after-join chain
*/
default AfterJoinSqlChain leftJoin(Class<?> clazz) {
return join(JoinCandidate.LEFT_JOIN, clazz, null);
}
/**
* Joins another table with LEFT JOIN
*
* @param clazz the other table's entity
* @param alias the alias
* @return the after-join chain
*/
default AfterJoinSqlChain leftJoin(Class<?> clazz, String alias) {
return join(JoinCandidate.LEFT_JOIN, clazz, alias);
}
/**
* Joins another table with RIGHT JOIN
*
* @param clazz the other table's entity
* @return the after-join chain
*/
default AfterJoinSqlChain rightJoin(Class<?> clazz) {
return join(JoinCandidate.RIGHT_JOIN, clazz, null);
}
/**
* Joins another table with RIGHT JOIN
*
* @param clazz the other table's entity
* @param alias the alias
* @return the after-join chain
*/
default AfterJoinSqlChain rightJoin(Class<?> clazz, String alias) {
return join(JoinCandidate.RIGHT_JOIN, clazz, alias);
}
}

@@ -0,0 +1,26 @@
package group.flyfish.fluent.chain.common;
/**
* The chain right after the select fields are set
*/
public interface PreSqlChain {
/**
* Selects from the given table
*
* @param type the table entity class
* @return the handle chain
*/
HandleSqlChain from(Class<?> type);
/**
* Selects from the given table with an explicit alias.
* Useful when the same table appears in FROM more than once, e.g. self-joins;
* in most cases you do not need to specify an alias.
*
* @param type  the table entity class
* @param alias the alias
* @return the handle chain
*/
HandleSqlChain from(Class<?> type, String alias);
}

@@ -0,0 +1,32 @@
package group.flyfish.fluent.chain.select;
import group.flyfish.fluent.chain.common.ExecutableSql;
import java.util.List;
/**
* Operations available after ORDER BY
*
* @author wangyu
*/
public interface AfterOrderSqlChain extends ExecutableSql {
/**
* Executes and fetches a single result
*
* @param clazz the result class
* @param <T>   the result type
* @return the single result value
*/
<T> T one(Class<T> clazz);
/**
* Executes and fetches multiple results
*
* @param clazz the result class
* @param <T>   the result type
* @return the result list
*/
<T> List<T> list(Class<T> clazz);
}

@@ -0,0 +1,19 @@
package group.flyfish.fluent.chain.select;
import group.flyfish.fluent.chain.Order;
/**
* Operations available after WHERE
*
* @author wangyu
*/
public interface AfterWhereSqlChain extends AfterOrderSqlChain {
/**
* Appends the ORDER BY clause
*
* @param orders the orderings
* @return the after-order chain
*/
AfterOrderSqlChain order(Order... orders);
}

@@ -0,0 +1,73 @@
package group.flyfish.fluent.chain.select;
import group.flyfish.fluent.utils.context.AliasComposite;
import group.flyfish.fluent.utils.sql.EntityNameUtils;
import group.flyfish.fluent.utils.sql.SFunction;
import group.flyfish.fluent.utils.sql.SqlNameUtils;
import java.util.Arrays;
import java.util.stream.Stream;
/**
* Generic wrapper for select fields
*
* @author wangyu
*/
@FunctionalInterface
public interface SelectComposite<T> {
/**
* Generic wrapper keyed by entity class T, combining any set of its fields
*
* @param getter the property getters
* @param <T>    the entity type
* @return the wrapped composite
*/
@SafeVarargs
static <T> SelectComposite<T> composite(SFunction<T, Object>... getter) {
return () -> Arrays.stream(getter);
}
/**
* Single-field composite with an explicit alias
*
* @param getter the property getter
* @param alias  the alias
* @param <T>    the entity type
* @return the wrapped composite
*/
static <T> SelectComposite<T> composite(SFunction<T, Object> getter, String alias) {
AliasComposite.add(getter, alias);
return () -> Stream.of(getter);
}
/**
* Combines several composites and unwraps them, ignoring generics
*
* @param composites the composites to combine
* @param <T>        the arbitrary return type
* @return the flattened field array
*/
static <T> T combine(SelectComposite<?>... composites) {
return SqlNameUtils.cast(Arrays.stream(composites).flatMap(SelectComposite::stream).toArray(SFunction[]::new));
}
/**
* Selects all fields of the given table
*
* @param clazz the entity class
* @param <T>   the entity type
* @return the field composite
*/
static <T> SelectComposite<T> all(Class<T> clazz) {
return () -> EntityNameUtils.getFields(clazz).entrySet().stream()
.map(entry -> new SFunction.StaticRef<>(clazz, entry.getKey(), entry.getValue()));
}
/**
* Returns the fields as a stream
*
* @return the stream of fields
*/
Stream<SFunction<T, Object>> stream();
}
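`combine` flattens several composites into a single field array via `Stream.flatMap`. A type-free, self-contained sketch of that flattening (the `Composite` interface here is a stand-in for `SelectComposite`, with plain strings in place of `SFunction` fields):

```java
import java.util.Arrays;
import java.util.stream.Stream;

public class CompositeDemo {

    // Stand-in for SelectComposite: something that exposes its fields as a stream.
    public interface Composite {
        Stream<String> stream();
    }

    // Flatten several composites into one field array, like SelectComposite.combine.
    public static String[] combine(Composite... composites) {
        return Arrays.stream(composites).flatMap(Composite::stream).toArray(String[]::new);
    }

    public static void main(String[] args) {
        Composite tenant = () -> Stream.of("t1.id", "t1.name");
        Composite quota = () -> Stream.of("t2.related_id");
        System.out.println(String.join(",", combine(tenant, quota)));
        // prints "t1.id,t1.name,t2.related_id"
    }
}
```

Because each composite is just a deferred stream supplier, order is preserved and the flattening costs a single pass over the declared fields.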

@@ -0,0 +1,22 @@
package group.flyfish.fluent.chain.update;
import group.flyfish.fluent.chain.common.ExecutableSql;
import group.flyfish.fluent.chain.common.PreSqlChain;
import group.flyfish.fluent.chain.select.AfterWhereSqlChain;
import group.flyfish.fluent.query.Query;
/**
* Operations available after SET
*
* @author wangyu
*/
public interface AfterSetSqlChain extends PreSqlChain, ExecutableSql {
/**
* The WHERE condition
*
* @param query the condition query
* @return the after-where chain
*/
AfterWhereSqlChain matching(Query query);
}

@@ -0,0 +1,362 @@
package group.flyfish.fluent.mapping;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import group.flyfish.fluent.binding.Alias;
import group.flyfish.fluent.binding.JSONInject;
import group.flyfish.fluent.utils.data.ObjectMappers;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.*;
import org.springframework.core.annotation.MergedAnnotations;
import org.springframework.core.convert.ConversionService;
import org.springframework.core.convert.support.DefaultConversionService;
import org.springframework.dao.DataRetrievalFailureException;
import org.springframework.dao.InvalidDataAccessApiUsageException;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.support.JdbcUtils;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import java.beans.PropertyDescriptor;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Locale;
import java.util.Map;
/**
* Row mapper driven by SQL result mapping
*
* @param <T> the result entity type
* @author wangyu
*/
@Slf4j
public class SQLMappedRowMapper<T> implements RowMapper<T> {
private final ObjectMapper objectMapper = ObjectMappers.shared();
/**
* The class we are mapping to.
*/
@Nullable
private Class<T> mappedClass;
/**
* ConversionService for binding JDBC values to bean properties.
*/
@Nullable
private ConversionService conversionService = DefaultConversionService.getSharedInstance();
/**
* Map of the fields we provide mapping for.
*/
@Nullable
private Map<String, PropertyDescriptor> mappedFields;
/**
* Map of the fields that need JSON conversion.
*/
private Map<String, Class<?>> jsonFields;
/**
* Create a new {@code SQLMappedRowMapper}, accepting unpopulated
* properties in the target bean.
*
* @param mappedClass the class that each row should be mapped to
*/
public SQLMappedRowMapper(Class<T> mappedClass) {
initialize(mappedClass);
}
/**
* Static factory method to create a new {@code SQLMappedRowMapper}.
*
* @param mappedClass the class that each row should be mapped to
* @see #newInstance(Class, ConversionService)
*/
public static <T> SQLMappedRowMapper<T> newInstance(Class<T> mappedClass) {
return new SQLMappedRowMapper<>(mappedClass);
}
/**
* Static factory method to create a new {@code SQLMappedRowMapper}.
*
* @param mappedClass the class that each row should be mapped to
* @param conversionService the {@link ConversionService} for binding
* JDBC values to bean properties, or {@code null} for none
* @see #newInstance(Class)
* @see #setConversionService
* @since 5.2.3
*/
public static <T> SQLMappedRowMapper<T> newInstance(
Class<T> mappedClass, @Nullable ConversionService conversionService) {
SQLMappedRowMapper<T> rowMapper = newInstance(mappedClass);
rowMapper.setConversionService(conversionService);
return rowMapper;
}
/**
* Get the class that we are mapping to.
*/
@Nullable
public final Class<T> getMappedClass() {
return this.mappedClass;
}
/**
* Set the class that each row should be mapped to.
*/
public void setMappedClass(Class<T> mappedClass) {
if (this.mappedClass == null) {
initialize(mappedClass);
} else {
if (this.mappedClass != mappedClass) {
throw new InvalidDataAccessApiUsageException("The mapped class can not be reassigned to map to " +
mappedClass + " since it is already providing mapping for " + this.mappedClass);
}
}
}
/**
* Return a {@link ConversionService} for binding JDBC values to bean properties,
* or {@code null} if none.
*
* @since 4.3
*/
@Nullable
public ConversionService getConversionService() {
return this.conversionService;
}
/**
* Set a {@link ConversionService} for binding JDBC values to bean properties,
* or {@code null} for none.
* <p>Default is a {@link DefaultConversionService}, as of Spring 4.3. This
* provides support for {@code java.time} conversion and other special types.
*
* @see #initBeanWrapper(BeanWrapper)
* @since 4.3
*/
public void setConversionService(@Nullable ConversionService conversionService) {
this.conversionService = conversionService;
}
/**
* Initialize the mapping meta-data for the given class.
*
* @param mappedClass the mapped class
*/
protected void initialize(Class<T> mappedClass) {
this.mappedClass = mappedClass;
this.mappedFields = new HashMap<>();
this.jsonFields = new HashMap<>();
Map<String, MergedAnnotations> fieldAnnotations = new HashMap<>();
ReflectionUtils.doWithFields(mappedClass, field -> fieldAnnotations.put(field.getName(), MergedAnnotations.from(field)));
for (PropertyDescriptor pd : BeanUtils.getPropertyDescriptors(mappedClass)) {
if (pd.getWriteMethod() != null) {
MergedAnnotations annotations = fieldAnnotations.get(pd.getName());
String lowerCaseName;
if (annotations.isPresent(Alias.class)) {
String rawName = annotations.get(Alias.class).synthesize().value();
lowerCaseName = lowerCaseName(rawName.replace("_", ""));
} else {
lowerCaseName = lowerCaseName(pd.getName());
}
this.mappedFields.put(lowerCaseName, pd);
String underscoreName = underscoreName(pd.getName());
if (!lowerCaseName.equals(underscoreName)) {
this.mappedFields.put(underscoreName, pd);
}
                // register fields that require JSON conversion
if (annotations.isPresent(JSONInject.class)) {
this.jsonFields.put(pd.getName(), pd.getPropertyType());
}
}
}
}
/**
* Remove the specified property from the mapped fields.
*
* @param propertyName the property name (as used by property descriptors)
* @since 5.3.9
*/
protected void suppressProperty(String propertyName) {
if (this.mappedFields != null) {
this.mappedFields.remove(lowerCaseName(propertyName));
this.mappedFields.remove(underscoreName(propertyName));
}
}
/**
* Convert the given name to lower case.
* By default, conversions will happen within the US locale.
*
* @param name the original name
* @return the converted name
* @since 4.2
*/
protected String lowerCaseName(String name) {
return name.toLowerCase(Locale.US);
}
/**
* Convert a name in camelCase to an underscored name in lower case.
* Any upper case letters are converted to lower case with a preceding underscore.
*
* @param name the original name
* @return the converted name
* @see #lowerCaseName
* @since 4.2
*/
protected String underscoreName(String name) {
if (!StringUtils.hasLength(name)) {
return "";
}
StringBuilder result = new StringBuilder();
result.append(Character.toLowerCase(name.charAt(0)));
for (int i = 1; i < name.length(); i++) {
char c = name.charAt(i);
if (Character.isUpperCase(c)) {
result.append('_').append(Character.toLowerCase(c));
} else {
result.append(c);
}
}
return result.toString();
}
/**
* Extract the values for all columns in the current row.
* <p>Utilizes public setters and result set meta-data.
*
* @see java.sql.ResultSetMetaData
*/
@Override
public T mapRow(ResultSet rs, int rowNumber) throws SQLException {
BeanWrapperImpl bw = new BeanWrapperImpl();
initBeanWrapper(bw);
T mappedObject = constructMappedInstance(rs, bw);
bw.setBeanInstance(mappedObject);
ResultSetMetaData rsmd = rs.getMetaData();
int columnCount = rsmd.getColumnCount();
for (int index = 1; index <= columnCount; index++) {
String column = JdbcUtils.lookupColumnName(rsmd, index);
String field = lowerCaseName(StringUtils.delete(column, " "));
PropertyDescriptor pd = (this.mappedFields != null ? this.mappedFields.get(field) : null);
if (pd != null) {
try {
Object value = getColumnValue(rs, index, pd);
if (rowNumber == 0 && log.isDebugEnabled()) {
log.debug("Mapping column '" + column + "' to property '" + pd.getName() +
"' of type '" + ClassUtils.getQualifiedName(pd.getPropertyType()) + "'");
}
if (jsonFields.containsKey(pd.getName())) {
value = convert(value, jsonFields.get(pd.getName()));
}
bw.setPropertyValue(pd.getName(), value);
} catch (NotWritablePropertyException ex) {
throw new DataRetrievalFailureException(
"Unable to map column '" + column + "' to property '" + pd.getName() + "'", ex);
}
}
}
return mappedObject;
}
/**
* Construct an instance of the mapped class for the current row.
*
* @param rs the ResultSet to map (pre-initialized for the current row)
* @param tc a TypeConverter with this RowMapper's conversion service
* @return a corresponding instance of the mapped class
* @throws SQLException if an SQLException is encountered
* @since 5.3
*/
protected T constructMappedInstance(ResultSet rs, TypeConverter tc) throws SQLException {
Assert.state(this.mappedClass != null, "Mapped class was not specified");
return BeanUtils.instantiateClass(this.mappedClass);
}
/**
* Initialize the given BeanWrapper to be used for row mapping.
* To be called for each row.
* <p>The default implementation applies the configured {@link ConversionService},
* if any. Can be overridden in subclasses.
*
* @param bw the BeanWrapper to initialize
* @see #getConversionService()
* @see BeanWrapper#setConversionService
*/
protected void initBeanWrapper(BeanWrapper bw) {
ConversionService cs = getConversionService();
if (cs != null) {
bw.setConversionService(cs);
}
}
/**
* Retrieve a JDBC object value for the specified column.
* <p>The default implementation delegates to
* {@link #getColumnValue(ResultSet, int, Class)}.
*
* @param rs is the ResultSet holding the data
* @param index is the column index
* @param pd the bean property that each result object is expected to match
* @return the Object value
* @throws SQLException in case of extraction failure
* @see #getColumnValue(ResultSet, int, Class)
*/
@Nullable
protected Object getColumnValue(ResultSet rs, int index, PropertyDescriptor pd) throws SQLException {
return JdbcUtils.getResultSetValue(rs, index, pd.getPropertyType());
}
/**
* Retrieve a JDBC object value for the specified column.
* <p>The default implementation calls
* {@link JdbcUtils#getResultSetValue(java.sql.ResultSet, int, Class)}.
* Subclasses may override this to check specific value types upfront,
* or to post-process values return from {@code getResultSetValue}.
*
* @param rs is the ResultSet holding the data
* @param index is the column index
* @param paramType the target parameter type
* @return the Object value
* @throws SQLException in case of extraction failure
* @see org.springframework.jdbc.support.JdbcUtils#getResultSetValue(java.sql.ResultSet, int, Class)
* @since 5.3
*/
@Nullable
protected Object getColumnValue(ResultSet rs, int index, Class<?> paramType) throws SQLException {
return JdbcUtils.getResultSetValue(rs, index, paramType);
}
    /**
     * Convert a JSON string value into the target type.
     *
     * @param value the raw column value, expected to be a JSON string
     * @param type  the target type
     * @return the converted object, or {@code null} if conversion fails
     */
private Object convert(Object value, Class<?> type) {
if (value instanceof String) {
try {
return objectMapper.readValue((String) value, type);
} catch (JsonProcessingException e) {
                log.error("Failed to convert JSON to object: {}", e.getMessage());
return null;
}
}
return value;
}
}
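The matching above works because `initialize` registers each writable property under two keys, its lower-cased name and its underscored name, and `mapRow` looks columns up after the same normalization. A self-contained sketch of the conversion rule (the same algorithm as `underscoreName`, extracted here purely for illustration):

```java
import java.util.Locale;

public class NameDemo {
    // camelCase -> lower_underscore, mirroring SQLMappedRowMapper#underscoreName
    static String underscoreName(String name) {
        if (name == null || name.isEmpty()) {
            return "";
        }
        StringBuilder result = new StringBuilder();
        result.append(Character.toLowerCase(name.charAt(0)));
        for (int i = 1; i < name.length(); i++) {
            char c = name.charAt(i);
            if (Character.isUpperCase(c)) {
                result.append('_').append(Character.toLowerCase(c));
            } else {
                result.append(c);
            }
        }
        return result.toString();
    }

    public static void main(String[] args) {
        // both keys get registered, so a column named "createtime" or
        // "create_time" resolves to the "createTime" property
        System.out.println(underscoreName("createTime"));        // create_time
        System.out.println("createTime".toLowerCase(Locale.US)); // createtime
    }
}
```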

@@ -0,0 +1,27 @@
package group.flyfish.fluent.query;
import group.flyfish.fluent.chain.SQLSegment;
import lombok.AllArgsConstructor;
import lombok.Getter;
/**
 * Candidate connectors for concatenating conditions.
 *
 * @author wangyu
 */
@AllArgsConstructor
@Getter
public enum ConcatCandidate implements SQLSegment {
AND(""), OR("");
private final String name;
    /**
     * @return the SQL segment
     */
@Override
public String get() {
return name();
}
}

@@ -0,0 +1,47 @@
package group.flyfish.fluent.query;
import group.flyfish.fluent.chain.SQLSegment;
import group.flyfish.fluent.utils.sql.SFunction;
import java.util.Collection;
/**
 * Conditional SQL chain.
 *
 * @author wangyu
 */
public interface Condition extends Parameterized, SQLSegment {
    /**
     * Equals the given value.
     *
     * @param value the value to compare with
     * @return the query chain
     */
    Query eq(Object value);
    /**
     * Equals another column.
     *
     * @param ref the column reference
     * @param <T> the entity type
     * @return the query chain
     */
    <T> Query eq(SFunction<T, ?> ref);
    /**
     * Fuzzy match (LIKE).
     *
     * @param pattern the keyword
     * @return the query chain
     */
    Query like(String pattern);
    /**
     * Contained in a collection.
     *
     * @param collection a collection of arbitrary values
     * @return the query chain
     */
Query in(Collection<?> collection);
}

@@ -0,0 +1,72 @@
package group.flyfish.fluent.query;
import group.flyfish.fluent.utils.text.TemplateCompiler;
import lombok.Getter;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
/**
 * Candidate templates for query conditions.
 *
 * @author wangyu
 */
@Getter
enum ConditionCandidate {
    EQ("column equals value", "{column} = ?"),
    NE("column not equals value", "{column} != ?"),
    GT("column greater than value", "{column} > ?"),
    GTE("column greater than or equals value", "{column} >= ?"),
    LT("column less than value", "{column} < ?"),
    LTE("column less than or equals value", "{column} <= ?"),
    LIKE("column fuzzy-matches value", "{column} LIKE CONCAT('%', ?, '%')"),
    LIKE_LEFT("column matches the left part of value", "{column} LIKE CONCAT(?, '%')"),
    LIKE_RIGHT("column matches the right part of value", "{column} LIKE CONCAT('%', ?)"),
    IN("column within the value list", "{column} IN ({?})", ConditionCandidate::multiple),
    NIN("column not within the value list", "{column} NOT IN ({?})", ConditionCandidate::multiple),
    NOT_NULL("column is not null", "{column} IS NOT NULL"),
    IS_NULL("column is null", "{column} IS NULL"),
    BETWEEN("column between list values at index 0 and 1", "{column} BETWEEN ? AND ?"),
    DATE_GTE("date column greater than value", "{column} > ?"),
    DATE_LTE("date column less than value", "{column} < ?");
private final String name;
private final TemplateCompiler.DynamicValue template;
private final Function<Object, ?> mapper;
ConditionCandidate(String name, String template) {
this(name, template, null);
}
ConditionCandidate(String name, String template, Function<Object, ?> mapper) {
this.name = name;
this.template = TemplateCompiler.compile(template);
this.mapper = mapper;
}
private static String multiple(Object value) {
return value instanceof Collection ?
((Collection<?>) value).stream().map(v -> "?").collect(Collectors.joining(", ")) : "?";
}
    /**
     * Compile the template against the given field and value.
     *
     * @param field the column name
     * @param value the bound value
     * @return the compiled SQL fragment
     */
public String compile(String field, Object value) {
Map<String, Object> params = new HashMap<>();
params.put("column", field);
params.put("value", value);
params.put("?", null != mapper ? mapper.apply(value) : "?");
return template.apply(params);
}
}
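`TemplateCompiler` itself is not part of this commit, so the exact placeholder syntax is inferred from the templates above. The sketch below reproduces the `multiple` mapper and uses a naive stand-in for the `{column}`/`{?}` substitution, just to show how `IN` expands to one `?` per element:

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.stream.Collectors;

public class InTemplateDemo {
    // same logic as ConditionCandidate#multiple: one '?' per collection element
    static String multiple(Object value) {
        return value instanceof Collection
                ? ((Collection<?>) value).stream().map(v -> "?").collect(Collectors.joining(", "))
                : "?";
    }

    // hypothetical stand-in for TemplateCompiler: substitute {column} and {?}
    static String compileIn(String column, Collection<?> values) {
        return "{column} IN ({?})"
                .replace("{column}", column)
                .replace("{?}", multiple(values));
    }

    public static void main(String[] args) {
        List<Integer> ids = Arrays.asList(1, 2, 3);
        System.out.println(compileIn("id", ids)); // id IN (?, ?, ?)
    }
}
```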

@@ -0,0 +1,26 @@
package group.flyfish.fluent.query;
import group.flyfish.fluent.chain.SQLSegment;
import lombok.AllArgsConstructor;
import lombok.Getter;
@Getter
@AllArgsConstructor
public enum JoinCandidate implements SQLSegment {
    INNER_JOIN("inner join", "INNER JOIN"),
    LEFT_JOIN("left join", "LEFT JOIN"),
    RIGHT_JOIN("right join", "RIGHT JOIN");
private final String name;
private final String content;
    /**
     * @return the SQL segment
     */
@Override
public String get() {
return content;
}
}

@@ -0,0 +1,32 @@
package group.flyfish.fluent.query;
import org.springframework.lang.Nullable;
import java.util.Collection;
/**
 * Marks a type that carries SQL parameters.
 *
 * @author wangyu
 */
public interface Parameterized {
    /**
     * Get the parameters carried by this object.
     * An empty collection means no parameters are required;
     * {@code null} means the parameters are empty and this condition or query should be skipped.
     *
     * @return the parameters
     */
@Nullable
Collection<Object> getParameters();
    /**
     * Whether this object is empty (i.e. its parameters are {@code null}).
     *
     * @return the result
     */
default boolean isEmpty() {
return null == getParameters();
}
}

@@ -0,0 +1,72 @@
package group.flyfish.fluent.query;
import group.flyfish.fluent.chain.SQLSegment;
import group.flyfish.fluent.utils.sql.SFunction;
/**
 * Entry point for building queries.
 *
 * @author wangyu
 */
public interface Query extends Parameterized, SQLSegment {
    /**
     * Start from WHERE.
     *
     * @param getter the column lambda
     * @param <T>    the entity type
     * @return the condition being built
     */
    static <T> Condition where(SFunction<T, ?> getter) {
        return new SimpleCondition(getter, new SimpleQuery());
    }
    /**
     * Chain the next condition with AND.
     *
     * @param getter the column lambda
     * @param <T>    the entity type
     * @return the condition being built
     */
    <T> Condition and(SFunction<T, ?> getter);
    /**
     * Chain an existing condition with AND.
     *
     * @param condition the other condition
     * @return the query chain
     */
    Query and(Condition condition);
    /**
     * Nest another query with AND.
     *
     * @param query the nested query
     * @return the query chain
     */
    Query and(Query query);
    /**
     * Chain the next condition with OR.
     *
     * @param getter the column lambda
     * @param <T>    the entity type
     * @return the condition being built
     */
    <T> Condition or(SFunction<T, ?> getter);
    /**
     * Chain an existing condition with OR.
     *
     * @param condition the other condition
     * @return the query chain
     */
    Query or(Condition condition);
    /**
     * Nest another query with OR.
     *
     * @param query the nested query
     * @return the query chain
     */
    Query or(Query query);
}

@@ -0,0 +1,115 @@
package group.flyfish.fluent.query;
import group.flyfish.fluent.utils.sql.EntityNameUtils;
import group.flyfish.fluent.utils.sql.SFunction;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
import java.util.Collection;
import java.util.Collections;
import java.util.function.Function;
/**
 * Query condition implementation.
 *
 * @author wangyu
 */
class SimpleCondition implements Condition {
private final SFunction<?, ?> target;
private final Function<Condition, Query> callback;
private Object value;
private ConditionCandidate candidate;
public SimpleCondition(SFunction<?, ?> target, SimpleQuery query) {
this(target, query::and);
}
public SimpleCondition(SFunction<?, ?> target, Function<Condition, Query> callback) {
this.target = target;
this.callback = callback;
}
@Override
@Nullable
public Collection<Object> getParameters() {
if (ObjectUtils.isEmpty(value)) {
return null;
}
if (value instanceof Collection) {
return cast(value);
}
if (value instanceof SFunction) {
return Collections.emptyList();
}
return Collections.singletonList(value);
}
    /**
     * @return the SQL segment
     */
@Override
public String get() {
String compiled = candidate.compile(target.getName(), value);
        // when the value is a column reference, replace the placeholder with the referenced column name
if (value instanceof SFunction) {
return compiled.replace("?", EntityNameUtils.toName(cast(value)));
}
return compiled;
}
    /**
     * Equals the given value.
     *
     * @param value the value to compare with
     * @return the query chain
     */
@Override
public Query eq(Object value) {
this.value = value;
this.candidate = ConditionCandidate.EQ;
return callback.apply(this);
}
    /**
     * Equals another column.
     *
     * @param ref the column reference
     * @return the query chain
     */
@Override
public <T> Query eq(SFunction<T, ?> ref) {
this.value = ref;
this.candidate = ConditionCandidate.EQ;
return callback.apply(this);
}
    /**
     * Fuzzy match (LIKE).
     *
     * @param pattern the keyword
     * @return the query chain
     */
@Override
public Query like(String pattern) {
this.value = pattern;
this.candidate = ConditionCandidate.LIKE;
return callback.apply(this);
}
    /**
     * Contained in a collection.
     *
     * @param collection a collection of arbitrary values
     * @return the query chain
     */
@Override
public Query in(Collection<?> collection) {
this.value = collection;
this.candidate = ConditionCandidate.IN;
return callback.apply(this);
}
}

@@ -0,0 +1,142 @@
package group.flyfish.fluent.query;
import group.flyfish.fluent.chain.SQLSegment;
import group.flyfish.fluent.utils.sql.ConcatSegment;
import group.flyfish.fluent.utils.sql.SFunction;
import org.springframework.lang.Nullable;
import org.springframework.util.CollectionUtils;
import java.util.ArrayList;
import java.util.Collection;
import java.util.stream.Collectors;
import static group.flyfish.fluent.query.ConcatCandidate.AND;
import static group.flyfish.fluent.query.ConcatCandidate.OR;
/**
 * Query builder implementation.
 *
 * @author wangyu
 */
class SimpleQuery extends ConcatSegment<SimpleQuery> implements Query {
    // parameter source
private final Collection<Object> parameters = new ArrayList<>();
    /**
     * @return the SQL segment
     */
@Override
public String get() {
return segments.stream().map(SQLSegment::get).collect(Collectors.joining(" "));
}
    /**
     * Get the parameter source.
     *
     * @return the parameters
     */
@Override
@Nullable
public Collection<Object> getParameters() {
if (segments.isEmpty()) {
return null;
}
return parameters;
}
    /**
     * Chain the next condition with AND.
     *
     * @param getter the column lambda
     * @return the condition being built
     */
@Override
public <T> Condition and(SFunction<T, ?> getter) {
return new SimpleCondition(getter, this::and);
}
    /**
     * Chain an existing condition with AND.
     *
     * @param condition the other condition
     * @return the query chain
     */
@Override
public Query and(Condition condition) {
if (condition.isEmpty()) {
return this;
}
addParameters(condition);
return concat(AND).concat(condition);
}
    /**
     * Nest another query with AND.
     *
     * @param query the nested query
     * @return the query chain
     */
@Override
public Query and(Query query) {
if (query.isEmpty()) {
return this;
}
addParameters(query);
return concat(AND).concat(query);
}
    /**
     * Chain the next condition with OR.
     *
     * @param getter the column lambda
     * @return the condition being built
     */
@Override
public <T> Condition or(SFunction<T, ?> getter) {
return new SimpleCondition(getter, this::or);
}
    /**
     * Chain an existing condition with OR.
     *
     * @param condition the other condition
     * @return the query chain
     */
@Override
public Query or(Condition condition) {
if (condition.isEmpty()) {
return this;
}
addParameters(condition);
return concat(OR).concat(condition);
}
    /**
     * Nest another query with OR.
     *
     * @param query the nested query
     * @return the query chain
     */
@Override
public Query or(Query query) {
if (query.isEmpty()) {
return this;
}
addParameters(query);
return concat(OR).concat(query);
}
    /**
     * Add parameters from a parameterized object.
     *
     * @param parameterized the object carrying parameters
     */
private void addParameters(Parameterized parameterized) {
Collection<Object> params = parameterized.getParameters();
if (!CollectionUtils.isEmpty(params)) {
            // append the parameters first
parameters.addAll(params);
}
}
}

@@ -0,0 +1,52 @@
package group.flyfish.fluent.update;
import group.flyfish.fluent.chain.SQLSegment;
import group.flyfish.fluent.chain.update.AfterSetSqlChain;
import group.flyfish.fluent.query.Parameterized;
import group.flyfish.fluent.utils.sql.SFunction;
/**
 * Update content chain.
 *
 * @author wangyu
 */
public interface Update extends SQLSegment, Parameterized {
    /**
     * Set all same-named values from the given object.
     *
     * @param value the source object
     * @return the chain
     */
default Update setAll(Object value) {
        throw new UnsupportedOperationException("This operation is not implemented yet!");
}
    /**
     * Set a column to a concrete value.
     *
     * @param target the target column
     * @param value  the concrete value
     * @param <T>    the entity type
     * @return the chain
     */
<T> Update set(SFunction<T, ?> target, Object value);
    /**
     * Set a column from another table's column.
     *
     * @param target the target column
     * @param source the source column
     * @param <T>    the target entity type
     * @param <V>    the source entity type
     * @return the chain
     */
<T, V> Update set(SFunction<T, ?> target, SFunction<V, ?> source);
    /**
     * Continue with the next stage of the chain.
     *
     * @return the result
     */
AfterSetSqlChain then();
}

@@ -0,0 +1,110 @@
package group.flyfish.fluent.update;
import group.flyfish.fluent.chain.SQLSegment;
import group.flyfish.fluent.chain.update.AfterSetSqlChain;
import group.flyfish.fluent.utils.sql.ConcatSegment;
import group.flyfish.fluent.utils.sql.SFunction;
import lombok.RequiredArgsConstructor;
import org.springframework.lang.Nullable;
import org.springframework.util.ObjectUtils;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;
/**
 * Update implementation.
 *
 * @author wangyu
 */
@RequiredArgsConstructor
public class UpdateImpl implements Update {
private final Function<Update, AfterSetSqlChain> chain;
    // parameter source
private final Collection<Object> parameters = new ArrayList<>();
    // SQL segments
private final List<SQLSegment> segments = new ArrayList<>();
    /**
     * Set a column to a concrete value.
     *
     * @param target the target column
     * @param value  the concrete value
     * @return the chain
     */
@Override
public <T> Update set(SFunction<T, ?> target, Object value) {
if (ObjectUtils.isEmpty(value)) {
return this;
}
this.parameters.add(value);
segments.add(new UpdatePart().concat(target::getName).concat("=").concat("?"));
return this;
}
    /**
     * Set a column from another table's column.
     *
     * @param target the target column
     * @param source the source column
     * @return the chain
     */
@Override
public <T, V> Update set(SFunction<T, ?> target, SFunction<V, ?> source) {
segments.add(new UpdatePart().concat(target::getName).concat("=").concat(source::getName));
return this;
}
    /**
     * Continue with the next stage of the chain.
     *
     * @return the result
     */
@Override
public AfterSetSqlChain then() {
return chain.apply(this);
}
    /**
     * @return the SQL segment
     */
@Override
public String get() {
return segments.stream().map(SQLSegment::get).collect(Collectors.joining(", "));
}
    /**
     * Get the parameters carried by this object.
     * An empty collection means no parameters are required;
     * {@code null} means the parameters are empty and this condition or query should be skipped.
     *
     * @return the parameters
     */
@Override
@Nullable
public Collection<Object> getParameters() {
if (segments.isEmpty()) {
return null;
}
return parameters;
}
    /**
     * Internal update part.
     */
private static class UpdatePart extends ConcatSegment<UpdatePart> implements SQLSegment {
        /**
         * @return the SQL segment
         */
@Override
public String get() {
return this.segments.stream().map(SQLSegment::get).collect(Collectors.joining(" "));
}
}
}

@@ -0,0 +1,36 @@
package group.flyfish.fluent.utils.cache;
import java.util.LinkedHashMap;
import java.util.Map;
/**
 * LRU (least recently used) cache.<br>
 * Entries are kept or evicted based on how recently they were used:
 * when the cache is full, the least recently used entry is removed.<br>
 * Backed by {@link LinkedHashMap} in access order, so each access moves
 * the entry to the head of the internal linked list.<br>
 * The algorithm is simple and fast; unlike FIFO it has the notable advantage
 * that frequently used entries are unlikely to be evicted.<br>
 * Its drawback is that access slows down once the cache is full.
 *
 * @param <K> the key type
 * @param <V> the value type
 * @author wangyu
 */
public class LRUCache<K, V> extends LinkedHashMap<K, V> {
private static final long serialVersionUID = 1L;
private final int maxSize;
public LRUCache(int maxSize) {
        // accessOrder must be true for LRU (access-ordered) eviction
        this(maxSize, 16, 0.75f, true);
}
public LRUCache(int maxSize, int initialCapacity, float loadFactor, boolean accessOrder) {
super(initialCapacity, loadFactor, accessOrder);
this.maxSize = maxSize;
}
@Override
protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
return this.size() > this.maxSize;
}
}
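With the backing map in access order, a `get` refreshes an entry, so the least recently *used* key is evicted rather than merely the oldest insertion. A self-contained sketch mirroring the class above:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {
    // minimal access-ordered LRU cache, mirroring LRUCache
    static class Lru<K, V> extends LinkedHashMap<K, V> {
        private final int maxSize;

        Lru(int maxSize) {
            super(16, 0.75f, true); // accessOrder = true: get() refreshes an entry
            this.maxSize = maxSize;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxSize;
        }
    }

    public static void main(String[] args) {
        Lru<String, Integer> cache = new Lru<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // touch "a", making "b" the eldest entry
        cache.put("c", 3); // evicts "b", not "a"
        System.out.println(cache.keySet()); // [a, c]
    }
}
```

With `accessOrder = false` the same sequence would have evicted `"a"`, which is FIFO behaviour rather than LRU.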

@@ -0,0 +1,182 @@
package group.flyfish.fluent.utils.context;
import group.flyfish.fluent.utils.sql.SFunction;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
/**
 * Alias manager.
 * Because the SQL builder runs on a fixed thread, this class uses thread-local state.
 *
 * @author wangyu
 */
public final class AliasComposite {
private static final String PREFIX = "t";
    // column alias store: a simple thread-local map, discarded after reading
private static final ThreadLocal<AliasCache> ALIAS = new ThreadLocal<>();
    /**
     * Register an alias.
     *
     * @param key   the alias key
     * @param alias the alias
     * @return the cached alias
     */
public static String add(Class<?> key, String alias) {
return sharedCache().add(key, alias);
}
    /**
     * Register an alias.
     *
     * @param key   the alias key
     * @param alias the alias
     * @return the cached alias
     */
public static <T> String add(SFunction<T, ?> key, String alias) {
return sharedCache().add(key, alias);
}
    /**
     * Whether the table has an alias.
     *
     * @param key the entity class
     * @return whether it exists
     */
public static boolean has(Class<?> key) {
return sharedCache().has(key);
}
    /**
     * Whether the column has an alias.
     *
     * @param key the column reference
     * @return whether it exists
     */
public static <T> boolean has(SFunction<T, ?> key) {
return sharedCache().has(key);
}
    /**
     * Get the alias.
     *
     * @param key the entity class
     * @return the alias
     */
public static String get(Class<?> key) {
return sharedCache().get(key);
}
    /**
     * Get the alias.
     *
     * @param key the column reference
     * @return the alias
     */
public static <T> String get(SFunction<T, ?> key) {
return sharedCache().get(key);
}
    /**
     * Clear the alias cache for the current thread.
     */
public static void flush() {
ALIAS.remove();
}
public static AliasCache sharedCache() {
AliasCache cache = ALIAS.get();
if (null == cache) {
cache = new AliasCache();
ALIAS.set(cache);
}
return cache;
}
public static class AliasCache {
        // table alias cache
private final Map<Class<?>, String> instance = new ConcurrentHashMap<>();
        // built-in counter for generated table aliases
private final AtomicInteger counter = new AtomicInteger(0);
        // column alias cache
private final Map<SFunction<?, ?>, String> columns = new ConcurrentHashMap<>();
        /**
         * Caching rules:
         * 1. If a manual alias is given, it takes precedence and overwrites the cache.
         * 2. If no manual alias is given and a cached alias exists, the cache is kept.
         *
         * @param key   the cache key
         * @param alias the alias, may be null
         * @return the cached alias
         */
public String add(Class<?> key, @Nullable String alias) {
if (StringUtils.hasText(alias)) {
instance.put(key, alias);
return alias;
} else {
return get(key);
}
}
public boolean has(Class<?> key) {
return instance.containsKey(key);
}
public String get(Class<?> key) {
return instance.computeIfAbsent(key, this::generate);
}
        /**
         * Generate an alias for the key.
         *
         * @param key the type key
         * @return the generated alias
         */
public String generate(Class<?> key) {
return PREFIX + counter.incrementAndGet();
}
        /**
         * Whether a column reference has a bound alias.
         *
         * @param column the column reference
         * @param <T>    the entity type
         * @return whether it exists
         */
public <T> boolean has(SFunction<T, ?> column) {
return columns.containsKey(column);
}
        /**
         * Cache a column alias.
         *
         * @param column the column reference
         * @param alias  the alias
         * @param <T>    the entity type
         * @return the cached alias
         */
public <T> String add(SFunction<T, ?> column, String alias) {
columns.put(column, alias);
return alias;
}
        /**
         * Get the alias bound to a column reference.
         *
         * @param column the column reference
         * @param <T>    the entity type
         * @return the alias, may be null
         */
public <T> String get(SFunction<T, ?> column) {
return columns.get(column);
}
}
}
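The table-alias generation above can be sketched with a stripped-down version of `AliasCache`: a thread-local map plus an incrementing counter yielding `t1`, `t2`, and so on (this demo class and its names are illustrative, not part of the project):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class AliasDemo {
    // per-thread alias table, mirroring AliasComposite.AliasCache
    static final ThreadLocal<AliasDemo> LOCAL = ThreadLocal.withInitial(AliasDemo::new);

    private final Map<Class<?>, String> instance = new ConcurrentHashMap<>();
    private final AtomicInteger counter = new AtomicInteger(0);

    String get(Class<?> key) {
        // first lookup generates t1, t2, ...; later lookups reuse the cached alias
        return instance.computeIfAbsent(key, k -> "t" + counter.incrementAndGet());
    }

    public static void main(String[] args) {
        AliasDemo cache = LOCAL.get();
        System.out.println(cache.get(String.class));  // t1
        System.out.println(cache.get(Integer.class)); // t2
        System.out.println(cache.get(String.class));  // t1 (cached)
        LOCAL.remove(); // equivalent of AliasComposite.flush()
    }
}
```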

@@ -0,0 +1,32 @@
package group.flyfish.fluent.utils.data;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import java.text.SimpleDateFormat;
/**
 * The single shared Jackson helper for the project.
 *
 * @author wangyu
 */
public final class ObjectMappers {
private static final ObjectMapper objectMapper;
static {
objectMapper = new ObjectMapper();
objectMapper.setDateFormat(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"));
objectMapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
objectMapper.disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);
}
private ObjectMappers() {
}
public static ObjectMapper shared() {
return objectMapper;
}
}

@@ -0,0 +1,31 @@
package group.flyfish.fluent.utils.data;
import com.fasterxml.jackson.core.JsonProcessingException;
import org.springframework.beans.BeanUtils;
import java.security.InvalidParameterException;
/**
 * Parameter utilities.
 *
 * @author wangyu
 */
public final class ParameterUtils {
public static Object convert(Object value) {
if (null == value) {
return null;
}
if (value instanceof Enum) {
return String.valueOf(value);
}
if (BeanUtils.isSimpleProperty(value.getClass())) {
return value;
}
try {
return ObjectMappers.shared().writeValueAsString(value);
} catch (JsonProcessingException e) {
            throw new InvalidParameterException("Not valid JSON data or an unrecognized class");
}
}
}

@@ -0,0 +1,63 @@
package group.flyfish.fluent.utils.sql;
import group.flyfish.fluent.chain.SQLSegment;
import group.flyfish.fluent.query.ConcatCandidate;
import lombok.RequiredArgsConstructor;
import java.util.ArrayList;
import java.util.List;
/**
 * A segment chain that supports concatenation.
 *
 * @author wangyu
 */
public abstract class ConcatSegment<T extends ConcatSegment<T>> {
    // the chain of segments
protected final List<SQLSegment> segments = new ArrayList<>();
    /**
     * Concatenate a SQL segment.
     * A leading connector (AND/OR) is ignored while the chain is still empty.
     *
     * @param segment the segment
     * @return this chain
     */
public T concat(SQLSegment segment) {
if (segments.isEmpty() && segment instanceof ConcatCandidate) {
return self();
}
this.segments.add(segment);
return self();
}
    /**
     * Concatenate a static SQL fragment.
     *
     * @param content the content
     * @return this chain
     */
public T concat(String content) {
this.segments.add(new StaticSegment(content));
return self();
}
@SuppressWarnings("unchecked")
private T self() {
return (T) this;
}
@RequiredArgsConstructor
private static class StaticSegment implements SQLSegment {
private final String content;
        /**
         * @return the SQL segment
         */
@Override
public String get() {
return content;
}
}
}
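The guard in `concat(SQLSegment)` drops a connector (`ConcatCandidate`) while the chain is still empty, so a built query never starts with a dangling `AND`/`OR`. A minimal self-contained sketch of that behaviour (the types here are local stand-ins, not the project's own):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class ConcatDemo {
    interface Seg { String get(); }

    enum Connector implements Seg {
        AND, OR;
        public String get() { return name(); }
    }

    static class Chain {
        final List<Seg> segments = new ArrayList<>();

        Chain concat(Seg s) {
            // like ConcatSegment: ignore a connector at the very start
            if (segments.isEmpty() && s instanceof Connector) {
                return this;
            }
            segments.add(s);
            return this;
        }

        Chain concat(String content) { return concat(() -> content); }

        String get() {
            return segments.stream().map(Seg::get).collect(Collectors.joining(" "));
        }
    }

    public static void main(String[] args) {
        String sql = new Chain()
                .concat(Connector.AND) // dropped: chain is still empty
                .concat("a = ?")
                .concat(Connector.AND)
                .concat("b = ?")
                .get();
        System.out.println(sql); // a = ? AND b = ?
    }
}
```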

@@ -0,0 +1,157 @@
package group.flyfish.fluent.utils.sql;
import group.flyfish.fluent.utils.cache.LRUCache;
import group.flyfish.fluent.utils.context.AliasComposite;
import lombok.AccessLevel;
import lombok.NoArgsConstructor;
import org.springframework.util.ReflectionUtils;
import org.springframework.util.StringUtils;
import javax.persistence.Column;
import javax.persistence.Table;
import java.lang.ref.WeakReference;
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.BinaryOperator;
import static group.flyfish.fluent.utils.sql.SqlNameUtils.wrap;
/**
 * Property name resolver.
 *
 * @author wangyu
 */
@NoArgsConstructor(access = AccessLevel.PRIVATE)
public final class EntityNameUtils {
    // cache of deserialized SerializedLambda instances
private static final Map<String, WeakReference<SerializedLambda>> FUNC_CACHE = new ConcurrentHashMap<>();
    // column name cache
private static final Map<Class<?>, Map<String, String>> COLUMN_CACHE = new LRUCache<>(5);
    // table name cache
private static final Map<Class<?>, String> TABLE_CACHE = new LRUCache<>(3);
public static <T> String toName(SFunction<T, ?> func) {
return resolve(func, (column, property) -> wrap(column));
}
public static <T> String toSelect(SFunction<T, ?> func) {
return resolve(func,
(column, property) -> String.join(" ", wrap(column), "as", wrap(property)));
}
public static Map<String, String> getFields(Class<?> clazz) {
tryCache(clazz);
return COLUMN_CACHE.get(clazz);
}
    /**
     * Resolve the table name from an entity class.
     *
     * @param entityClass the entity class
     * @return the table name
     */
public static String getTableName(Class<?> entityClass) {
return TABLE_CACHE.computeIfAbsent(entityClass, k -> {
Table table = entityClass.getAnnotation(Table.class);
if (null != table && StringUtils.hasText(table.name())) {
return table.name();
}
return wrap(SqlNameUtils.camelToUnderline(entityClass.getSimpleName()));
});
}
    /**
     * Resolve the column and apply the handler function.
     *
     * @param func    the method reference
     * @param handler the handler
     * @param <T>     the entity type
     * @return the handled result
     */
private static <T> String resolve(SFunction<T, ?> func, BinaryOperator<String> handler) {
SerializedLambda lambda = resolve(func);
String property = SqlNameUtils.methodToProperty(lambda.getImplMethodName());
Class<?> beanClass = resolveEntityClass(lambda);
String column = COLUMN_CACHE.get(beanClass).getOrDefault(property, SqlNameUtils.camelToUnderline(property));
        // fetch the alias cache
AliasComposite.AliasCache cache = AliasComposite.sharedCache();
        // determine the final name
String finalName = cache.has(func) ? cache.get(func) : property;
if (cache.has(beanClass)) {
return cache.get(beanClass) + "." + handler.apply(column, finalName);
}
return handler.apply(column, finalName);
}
    /**
     * Resolve a method reference into a SerializedLambda instance.
     * Results are cached.
     *
     * @param func the method reference
     * @param <T>  the entity type
     * @return the resolved lambda
     */
private static <T> SerializedLambda resolve(SFunction<T, ?> func) {
Class<?> clazz = func.getClass();
String name = clazz.getName();
return Optional.ofNullable(FUNC_CACHE.get(name))
.map(WeakReference::get)
.orElseGet(() -> {
SerializedLambda lambda = SerializedLambda.resolve(func);
FUNC_CACHE.put(name, new WeakReference<>(lambda));
return lambda;
});
}
    /**
     * Resolve the entity class from a serialized lambda.
     *
     * @param lambda the serialized lambda
     * @return the resolved class
     */
private static Class<?> resolveEntityClass(SerializedLambda lambda) {
return tryCache(lambda.getInstantiatedType());
}
    /**
     * Build the field cache for a type.
     *
     * @param type the type
     * @return the built cache
     */
private static Map<String, String> buildFieldsCache(Class<?> type) {
Map<String, String> fields = new HashMap<>();
ReflectionUtils.doWithFields(type, field -> fields.put(field.getName(), resolveFinalName(field)));
return fields;
}
    /**
     * Resolve the column name from the field annotation,
     * falling back to the underscore convention.
     *
     * @param field the field
     * @return the resolved name
     */
private static String resolveFinalName(Field field) {
Column column = field.getAnnotation(Column.class);
if (null != column && StringUtils.hasText(column.name())) {
return column.name();
}
return SqlNameUtils.camelToUnderline(field.getName());
}
    /**
     * Populate the cache for the given class if absent.
     *
     * @param entityClass the bean type
     * @return the same class, for chaining
     */
private static Class<?> tryCache(Class<?> entityClass) {
COLUMN_CACHE.computeIfAbsent(entityClass, EntityNameUtils::buildFieldsCache);
return entityClass;
}
}

@@ -0,0 +1,82 @@
package group.flyfish.fluent.utils.sql;
import group.flyfish.fluent.utils.context.AliasComposite;
import lombok.RequiredArgsConstructor;
import java.io.Serializable;
import java.util.function.Function;
import java.util.function.Supplier;
import static group.flyfish.fluent.utils.sql.SqlNameUtils.wrap;
/**
 * A {@link Function} that supports serialization.
 *
 * @param <T> the input type
 * @param <R> the return type
 */
@FunctionalInterface
public interface SFunction<T, R> extends Function<T, R>, Serializable {
    /**
     * Shortcut for resolving the column name.
     *
     * @return the resolved name
     */
default String getName() {
return EntityNameUtils.toName(this);
}
    /**
     * Shortcut for the select expression, including the alias.
     *
     * @return the result
     */
default String getSelect() {
return EntityNameUtils.toSelect(this);
}
@SuppressWarnings("unchecked")
default <V, P> SFunction<V, P> cast() {
return (SFunction<V, P>) this;
}
    /**
     * Static reference implementation, overriding the lambda-based resolution.
     *
     * @param <T> the entity type
     * @param <R> the return type
     */
@RequiredArgsConstructor
class StaticRef<T, R> implements SFunction<T, R> {
private final Class<?> type;
private final String name;
private final String column;
@Override
public String getName() {
return handle(() -> wrap(name));
}
@Override
public String getSelect() {
return handle(() -> String.join(" ", wrap(column), "as", wrap(name)));
}
private String handle(Supplier<String> handler) {
if (AliasComposite.has(type)) {
return AliasComposite.get(type) + "." + handler.get();
}
return handler.get();
}
@Override
public R apply(T t) {
return null;
}
}
}
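`SFunction` extends `Serializable` because every serializable lambda carries a synthetic `writeReplace()` method returning a `java.lang.invoke.SerializedLambda`, which exposes the name of the method it implements. A stdlib-only sketch using reflection (an alternative to the serialization round-trip used by this project's own `SerializedLambda#resolve`):

```java
import java.io.Serializable;
import java.lang.reflect.Method;
import java.util.function.Function;

public class LambdaNameDemo {
    // local stand-in for SFunction: a serializable Function
    interface SF<T, R> extends Function<T, R>, Serializable {}

    static String implMethodName(SF<?, ?> fn) throws Exception {
        // every serializable lambda gets a synthetic writeReplace()
        // that returns a java.lang.invoke.SerializedLambda
        Method writeReplace = fn.getClass().getDeclaredMethod("writeReplace");
        writeReplace.setAccessible(true);
        java.lang.invoke.SerializedLambda sl =
                (java.lang.invoke.SerializedLambda) writeReplace.invoke(fn);
        return sl.getImplMethodName();
    }

    public static void main(String[] args) throws Exception {
        SF<String, Integer> f = String::length;
        System.out.println(implMethodName(f)); // length
    }
}
```

From the implementing method name (`getXxx`/`isXxx`), a property name can then be derived, which is what `SqlNameUtils.methodToProperty` does in `EntityNameUtils#resolve`.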

@@ -0,0 +1,156 @@
/*
* Copyright (c) 2011-2021, baomidou (jobob@qq.com).
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package group.flyfish.fluent.utils.sql;
import org.springframework.util.Assert;
import org.springframework.util.ClassUtils;
import org.springframework.util.SerializationUtils;
import java.io.*;
/**
* Copied from {@link java.lang.invoke.SerializedLambda}; the fields are identical.
* <p>Deserializes a serializable Function into a SerializedLambda.</p>
*
* @author mbp
*/
@SuppressWarnings("unused")
class SerializedLambda implements Serializable {
private static final long serialVersionUID = 8025925345765570181L;
private Class<?> capturingClass;
private String functionalInterfaceClass;
private String functionalInterfaceMethodName;
private String functionalInterfaceMethodSignature;
private String implClass;
private String implMethodName;
private String implMethodSignature;
private int implMethodKind;
private String instantiatedMethodType;
private Object[] capturedArgs;
/**
* Resolve lambda metadata via deserialization. This only works for actual
* lambda expressions, not interface implementations or other non-lambda objects.
*
* @param lambda the lambda instance
* @return the resolved SerializedLambda
*/
public static SerializedLambda resolve(SFunction<?, ?> lambda) {
Assert.isTrue(lambda.getClass().isSynthetic(), "This method only accepts synthetic classes generated by lambda expressions");
byte[] stream = SerializationUtils.serialize(lambda);
Assert.notNull(stream, "Failed to serialize the lambda!");
try (ObjectInputStream objIn = new ObjectInputStream(new ByteArrayInputStream(stream)) {
@Override
protected Class<?> resolveClass(ObjectStreamClass objectStreamClass) throws IOException, ClassNotFoundException {
Class<?> clazz;
try {
clazz = forName(objectStreamClass.getName());
} catch (Exception ex) {
clazz = super.resolveClass(objectStreamClass);
}
return clazz == java.lang.invoke.SerializedLambda.class ? SerializedLambda.class : clazz;
}
}) {
return (SerializedLambda) objIn.readObject();
} catch (ClassNotFoundException | IOException e) {
throw new IllegalStateException("Unable to deserialize the lambda; this should never happen", e);
}
}
/**
* Load a class by name, swallowing lookup errors; call only when the class
* is known to exist.
*
* @param name the class name
* @return the class, or null if it cannot be loaded
*/
private static Class<?> forName(String name) {
try {
return ClassUtils.forName(name, null);
} catch (ClassNotFoundException e) {
return null;
}
}
/**
* Get the functional interface's class name.
*
* @return the class name
*/
public String getFunctionalInterfaceClassName() {
return normalizedName(functionalInterfaceClass);
}
/**
* Get the implementing class.
*
* @return the implementing class
*/
public Class<?> getImplClass() {
return forName(getImplClassName());
}
/**
* Get the implementing class's name.
*
* @return the class name
*/
public String getImplClassName() {
return normalizedName(implClass);
}
/**
* Get the implementing method's name.
*
* @return the method name
*/
public String getImplMethodName() {
return implMethodName;
}
/**
* Normalize a class name by replacing '/' with '.'.
*
* @param name the raw name
* @return the normalized class name
*/
private String normalizedName(String name) {
return name.replace('/', '.');
}
/**
* @return the instantiated method type
*/
public Class<?> getInstantiatedType() {
String instantiatedTypeName = normalizedName(instantiatedMethodType.substring(2, instantiatedMethodType.indexOf(';')));
return forName(instantiatedTypeName);
}
/**
* @return the string representation
*/
@Override
public String toString() {
String interfaceName = getFunctionalInterfaceClassName();
String implName = getImplClassName();
return String.format("%s -> %s::%s",
interfaceName.substring(interfaceName.lastIndexOf('.') + 1),
implName.substring(implName.lastIndexOf('.') + 1),
implMethodName);
}
}
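The `getInstantiatedType()` substring logic relies on the shape of JVM method-type descriptors: a descriptor such as `(Lpkg/Name;)Ljava/lang/String;` opens with `(L`, so the receiver class sits between index 2 and the first `;`. A small sketch (the descriptor string is an illustrative example, not taken from the project):

```java
public class DescriptorDemo {

    // Mirrors SerializedLambda.getInstantiatedType(): strip "(L", cut at the
    // first ';', and normalize '/' to '.'.
    static String instantiatedTypeName(String descriptor) {
        return descriptor.substring(2, descriptor.indexOf(';')).replace('/', '.');
    }

    public static void main(String[] args) {
        System.out.println(instantiatedTypeName("(Lgroup/flyfish/User;)Ljava/lang/String;"));
        // group.flyfish.User
    }
}
```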


@ -0,0 +1,103 @@
package group.flyfish.fluent.utils.sql;
import java.security.InvalidParameterException;
import java.util.Locale;
/**
* Utilities for converting between method, property and SQL column names.
*/
public class SqlNameUtils {
/**
* Convert a getter/setter method name to its property name.
*
* @param name the method name
* @return the property name
*/
public static String methodToProperty(String name) {
if (name.startsWith("is")) {
name = name.substring(2);
} else {
if (!name.startsWith("get") && !name.startsWith("set")) {
throw new InvalidParameterException("Not a valid getter or setter");
}
name = name.substring(3);
}
if (name.length() == 1 || name.length() > 1 && !Character.isUpperCase(name.charAt(1))) {
name = name.substring(0, 1).toLowerCase(Locale.ENGLISH) + name.substring(1);
}
return name;
}
/**
* Convert a snake_case name to camelCase.
* Returns an empty string when the input is empty.
* Example: hello_world -> helloWorld
*
* @param name the snake_case string
* @return the camelCase string
*/
public static String camelName(String name) {
StringBuilder result = new StringBuilder();
// fast checks
if (name == null || name.isEmpty()) {
// nothing to convert
return "";
} else if (!name.contains("_")) {
// no underscore: just lower-case the first letter
return name.substring(0, 1).toLowerCase() + name.substring(1);
}
// split the original string on underscores
String[] camels = name.split("_");
for (String camel : camels) {
// skip empty segments caused by leading, trailing or double underscores
if (camel.isEmpty()) {
continue;
}
// handle a real camel segment
if (result.length() == 0) {
// first segment: all letters lower-case
result.append(camel.toLowerCase());
} else {
// later segments: capitalize the first letter
result.append(camel.substring(0, 1).toUpperCase());
result.append(camel.substring(1).toLowerCase());
}
}
}
return result.toString();
}
/**
* Convert a camelCase name to snake_case.
*
* @param para the camelCase string
* @return the snake_case string
*/
public static String camelToUnderline(String para) {
if (para.length() < 3) {
return para.toLowerCase();
}
StringBuilder sb = new StringBuilder(para);
int offset = 0; // number of underscores inserted so far
// start from the third character to tolerate irregular prefixes
for (int i = 2; i < para.length(); i++) {
if (Character.isUpperCase(para.charAt(i))) {
sb.insert(i + offset, "_");
offset += 1;
}
}
return sb.toString().toLowerCase();
}
public static String wrap(String identifier) {
if (identifier.startsWith("`")) {
return identifier;
}
return String.join("", "`", identifier, "`");
}
@SuppressWarnings("unchecked")
public static <T, R> R cast(T t) {
return (R) t;
}
}
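The two conversions above are meant to round-trip between SQL and Java naming. A standalone sketch using simplified copies of `camelName` and `camelToUnderline` (the example inputs are illustrative):

```java
import java.util.Locale;

public class NameConversionDemo {

    // Simplified copy of SqlNameUtils.camelName: snake_case -> camelCase.
    static String camelName(String name) {
        if (name == null || name.isEmpty()) {
            return "";
        }
        if (!name.contains("_")) {
            return name.substring(0, 1).toLowerCase(Locale.ENGLISH) + name.substring(1);
        }
        StringBuilder result = new StringBuilder();
        for (String part : name.split("_")) {
            if (part.isEmpty()) {
                continue; // skip leading/trailing/double underscores
            }
            if (result.length() == 0) {
                result.append(part.toLowerCase(Locale.ENGLISH));
            } else {
                result.append(Character.toUpperCase(part.charAt(0)))
                      .append(part.substring(1).toLowerCase(Locale.ENGLISH));
            }
        }
        return result.toString();
    }

    // Simplified copy of SqlNameUtils.camelToUnderline: camelCase -> snake_case.
    static String camelToUnderline(String para) {
        if (para.length() < 3) {
            return para.toLowerCase(Locale.ENGLISH);
        }
        StringBuilder sb = new StringBuilder(para);
        int offset = 0;
        for (int i = 2; i < para.length(); i++) {
            if (Character.isUpperCase(para.charAt(i))) {
                sb.insert(i + offset++, "_");
            }
        }
        return sb.toString().toLowerCase(Locale.ENGLISH);
    }

    public static void main(String[] args) {
        System.out.println(camelName("hello_world"));       // helloWorld
        System.out.println(camelToUnderline("helloWorld")); // hello_world
        System.out.println(camelToUnderline("createTime")); // create_time
    }
}
```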


@ -0,0 +1,98 @@
package group.flyfish.fluent.utils.text;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;
/**
* Template compiler.
* Parses a template into static text and placeholder expressions, then joins
* the evaluated parts into the final string.
*/
public class TemplateCompiler {
// buffers static text while parsing
private final StringBuilder sb = new StringBuilder();
// holds the parsed parts: static text and placeholder expressions
private final List<Object> parts = new ArrayList<>();
// marks the start of the current placeholder; -1 when outside one
private int start = -1;
private TemplateCompiler(String code) {
parse(code);
}
/**
* Statically compile a template into a function.
*
* @param template the template
* @return the compiled result
*/
public static DynamicValue compile(String template) {
return new TemplateCompiler(template).getCompiled();
}
/**
* Dynamically evaluate a template against the given arguments.
*
* @param template the template
* @param args the argument map
* @return the rendered result
*/
public static String explain(String template, Map<String, Object> args) {
return new TemplateCompiler(template).getCompiled().apply(args);
}
private void parse(String code) {
int length = code.length();
for (int i = 0; i < length; i++) {
char c = code.charAt(i);
// inside a placeholder expression
if (start != -1) {
if (c == '}') {
parts.add(resolveExpression(code.substring(start, i)));
start = -1;
}
} else {
// matched the opening marker
if (c == '{') {
start = i + 1;
// flush any buffered static text first
if (sb.length() != 0) {
parts.add(sb.toString());
sb.delete(0, sb.length());
}
} else {
sb.append(c);
}
}
}
// parsing done: flush the remaining static text
if (sb.length() != 0) {
parts.add(sb.toString());
sb.delete(0, sb.length());
}
}
/**
* Get the compiled template.
*
* @return the compiled function
*/
public DynamicValue getCompiled() {
return context -> parts.stream().map(obj -> {
if (obj instanceof DynamicValue) {
return ((DynamicValue) obj).apply(context);
}
return String.valueOf(obj);
}).collect(Collectors.joining());
}
private DynamicValue resolveExpression(String key) {
return context -> String.valueOf(context.get(key));
}
@FunctionalInterface
public interface DynamicValue extends Function<Map<String, Object>, String> {
}
}
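The parse/apply cycle above can be condensed into a single function: scan for `{key}` markers, emit static text as-is, and turn each key into a map lookup. A compact re-sketch of the same technique (the SQL template is an illustrative example):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class TemplateDemo {

    // Splits the template into static-text parts and {key} lookups,
    // then joins each part's evaluation against the argument map.
    static Function<Map<String, Object>, String> compile(String template) {
        List<Function<Map<String, Object>, String>> parts = new ArrayList<>();
        StringBuilder sb = new StringBuilder();
        int start = -1;
        for (int i = 0; i < template.length(); i++) {
            char c = template.charAt(i);
            if (start != -1) {
                if (c == '}') {
                    String key = template.substring(start, i);
                    parts.add(ctx -> String.valueOf(ctx.get(key)));
                    start = -1;
                }
            } else if (c == '{') {
                start = i + 1;
                if (sb.length() != 0) {
                    String text = sb.toString();
                    parts.add(ctx -> text);
                    sb.setLength(0);
                }
            } else {
                sb.append(c);
            }
        }
        if (sb.length() != 0) {
            String text = sb.toString();
            parts.add(ctx -> text);
        }
        return ctx -> parts.stream().map(p -> p.apply(ctx)).collect(Collectors.joining());
    }

    public static void main(String[] args) {
        Map<String, Object> ctx = new HashMap<>();
        ctx.put("table", "t_user");
        System.out.println(compile("select * from {table}").apply(ctx)); // select * from t_user
    }
}
```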


@ -0,0 +1,19 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent>
<artifactId>fluent-sql</artifactId>
<groupId>group.flyfish.framework</groupId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>fluent-sql-jdbctemplate</artifactId>
<properties>
<maven.compiler.source>8</maven.compiler.source>
<maven.compiler.target>8</maven.compiler.target>
</properties>
</project>

pom.xml Normal file

@ -0,0 +1,181 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>group.flyfish.framework</groupId>
<artifactId>fluent-sql</artifactId>
<packaging>pom</packaging>
<version>1.0-SNAPSHOT</version>
<modules>
<module>fluent-sql-core</module>
<module>fluent-sql-jdbctemplate</module>
</modules>
<properties>
<maven.compiler.source>8</maven.compiler.source>
<maven.compiler.target>8</maven.compiler.target>
<lombok.version>1.18.24</lombok.version>
<jackson.version>2.13.3</jackson.version>
<spring.version>5.3.22</spring.version>
</properties>
<!-- 开发人员信息 -->
<developers>
<developer>
<name>wangyu</name>
<email>wybaby168@gmail.com</email>
<organization>http://flyfish.group</organization>
<timezone>+8</timezone>
</developer>
</developers>
<repositories>
<repository>
<id>central</id>
<url>https://maven.aliyun.com/nexus/content/groups/public</url>
</repository>
</repositories>
<distributionManagement>
<snapshotRepository>
<id>ossrh</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</snapshotRepository>
<repository>
<id>ossrh</id>
<url>https://oss.sonatype.org/service/local/staging/deploy/maven2/</url>
</repository>
</distributionManagement>
<dependencies>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>${lombok.version}</version>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.36</version>
</dependency>
<dependency>
<groupId>javax.persistence</groupId>
<artifactId>persistence-api</artifactId>
<version>1.0.2</version>
</dependency>
</dependencies>
</dependencyManagement>
<profiles>
<profile>
<id>release</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<configuration>
<show>package</show>
<doclint>none</doclint>
</configuration>
<executions>
<execution>
<id>attach-javadocs</id>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-gpg-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<id>sign-artifacts</id>
<phase>verify</phase>
<goals>
<goal>sign</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.sonatype.plugins</groupId>
<artifactId>nexus-staging-maven-plugin</artifactId>
<version>1.6.7</version>
<extensions>true</extensions>
<configuration>
<serverId>ossrh</serverId>
<nexusUrl>https://oss.sonatype.org/</nexusUrl>
<autoReleaseAfterClose>true</autoReleaseAfterClose>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<executions>
<execution>
<id>attach-sources</id>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.7.0</version>
<configuration>
<source>${maven.compiler.source}</source>
<target>${maven.compiler.target}</target>
<encoding>UTF-8</encoding>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>3.1.1</version>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
<overWriteReleases>false</overWriteReleases>
<overWriteSnapshots>true</overWriteSnapshots>
<overWriteIfNewer>true</overWriteIfNewer>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>