
Conversation

@1985312383
Contributor

Description

(The CI/CD build has been confirmed to pass.)
The three commits correspond to the following:

  1. Fix a flaw from the previous PR (improvement) (headless | chat | launchers | webapp) Added assistant function! Call LLM to optimize SQL code performance before final SQL execution #2306. Corrects the issue where the physically corrected SQL and the final executed SQL were displayed identically; they are now shown separately as the pre-correction SQL and the final executed SQL.
  2. Intelligent recognition previously relied on hard-coded logic and could only call the default gpt-4o-mini online. Changed so that any LLM configured on the LLM management page can be called, with the online gpt-4o-mini kept as the fallback, making it easy to use only a local model or another online model; custom prompts are also supported. [Enhancement] Intelligent fill cannot switch models or customize prompts #2312
  3. Fix the issue where aliases produced by intelligent recognition could not be converted in Q&A. [Bug] Dimension value aliases do not take effect #2021 [Bug] Composite metrics in complex-query CTEs are not replaced; physical SQL is wrong when the table in GROUP BY has an alias #1814 [Bug] Explanation fails with an error that metric and dimension counts do not match #2199

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)

How Has This Been Tested?

  • Verified the front-end display locally
  • Verified that, under the OpenAI protocol, prompts for both the gpt-4o-mini and glm-4-flash models work: metric and dimension intelligent recognition correctly identify and fill values and persist them to the table; the final UI is shown in [Enhancement] Intelligent fill cannot switch models or customize prompts #2312
  • After this change, the aliases I use are correctly recognized as the corresponding schema items in the Q&A conversation flow, and the subsequent physical conversion matches them to the actual SQL columns

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have added tests that prove my fix is effective or that my feature works
  • Any dependent changes have been merged and published in downstream modules

Corrects the issue where the physically corrected SQL and the final executed SQL were displayed identically; they are now shown separately as the pre-correction SQL and the final executed SQL. Also adds an up-front check for whether the correction content is empty.
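A minimal sketch of that display logic, assuming hypothetical field names (preCorrectionSql, finalExecutedSql) rather than the actual SuperSonic response structure:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hedged sketch only: show the pre-correction and final SQL separately, and fall back
// to the original SQL when the LLM correction came back empty. Class and key names
// here are assumptions, not the project's real types.
public final class SqlDisplaySketch {

    public static Map<String, String> buildSqlDisplay(String preCorrectionSql, String correctedSql) {
        Map<String, String> display = new LinkedHashMap<>();
        // Early check: an empty correction means the original SQL is also the final executed SQL.
        boolean hasCorrection = correctedSql != null && !correctedSql.isBlank();
        display.put("preCorrectionSql", preCorrectionSql);
        display.put("finalExecutedSql", hasCorrection ? correctedSql : preCorrectionSql);
        return display;
    }
}
```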
Removes the hard-coded intelligent-recognition logic, so it is no longer limited to calling the default gpt-4o-mini online: any other LLM can be called, models can be invoked locally only, and custom prompts are supported.
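The selection behaviour can be sketched roughly as below; ModelConfig, resolveModel, and the default prompt text are illustrative placeholders under my assumptions, not the real SuperSonic classes or values:

```java
import java.util.Optional;

// Hedged sketch: prefer the model the user configured on the LLM management page,
// fall back to the online gpt-4o-mini default, and allow the prompt to be overridden.
public class FillChatModelSelectorSketch {

    record ModelConfig(String provider, String modelName, String baseUrl, String apiKey) {}

    private static final ModelConfig DEFAULT_MODEL =
            new ModelConfig("OPEN_AI", "gpt-4o-mini", "https://api.openai.com/v1",
                    System.getenv("OPENAI_API_KEY"));

    private static final String DEFAULT_PROMPT =
            "Identify metric and dimension aliases for the given schema fields.";

    /** Pick the user-configured model if present, otherwise the gpt-4o-mini fallback. */
    public ModelConfig resolveModel(Optional<ModelConfig> configuredModel) {
        return configuredModel.orElse(DEFAULT_MODEL);
    }

    /** Pick the user-supplied prompt if present and non-blank, otherwise the built-in default. */
    public String resolvePrompt(Optional<String> customPrompt) {
        return customPrompt.filter(p -> !p.isBlank()).orElse(DEFAULT_PROMPT);
    }
}
```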
1. The buildOntologyQuery method only checks a metric's name and bizName, but never its alias (a rough sketch of this check follows the list below).
2. If the S2SQL references a metric by its alias, both fields.contains(m.getName()) and fields.contains(m.getBizName()) return false.
3. As a result, the metric is never added to the ontologyQuery.
4. The subsequent convertNameToBizName step then has no mapping to resolve the alias.

Modified the buildOntologyQuery method to add alias-checking logic at the key places (a hedged sketch of the new matching follows below):
Metric matching: check the metric's name, bizName, and alias
Dimension matching: check the dimension's name, bizName, and alias
Field removal: besides removing the name and bizName, also remove every matching alias
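A hedged sketch of the alias-aware matching and removal, with SchemaItem standing in for the project's metric/dimension types and the accessor names assumed rather than taken from the actual code:

```java
import java.util.List;
import java.util.Set;

// Sketch of the post-fix behaviour: name, bizName, or any alias can satisfy the match,
// and field removal also strips every matching alias.
class AliasAwareMatchingSketch {

    record SchemaItem(String name, String bizName, List<String> aliases) {}

    /** Metric/dimension matching: any of name, bizName, or an alias may appear in the S2SQL fields. */
    static boolean matches(SchemaItem item, Set<String> fields) {
        boolean byAlias = item.aliases() != null
                && item.aliases().stream().anyMatch(fields::contains);
        return fields.contains(item.name()) || fields.contains(item.bizName()) || byAlias;
    }

    /** Field removal: drop the name, the bizName, and every matching alias. */
    static void removeItemFields(SchemaItem item, Set<String> fields) {
        fields.remove(item.name());
        fields.remove(item.bizName());
        if (item.aliases() != null) {
            item.aliases().forEach(fields::remove);
        }
    }
}
```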
@1985312383 changed the title to: (fix | improvement) fix the aliases obtained by intelligent recognition could not be converted in Q&A. modify to allow all LLMs to be called on LLM management page #2312 (Jun 27, 2025)