
Conversation

@ikawrakow
Owner

@Nexesenex Does this fix your issue?

@Nexesenex
Contributor

Nexesenex commented Dec 27, 2025

Not yet, @ikawrakow.

I now get the following errors in llama.cpp:

All errors come from Q:\GitHub\ik_llama.cpp.fks\src\llama.cpp (project Q:\GitHub\ik_llama.cpp.fks\out\build\x64-Release-MMQ\ik_llama.cpp.fks, column 1, source: Build):

Error C2088, lines 7137, 7166, 7168, 7172: built-in operator '<<' cannot be applied to an operand of type 'std::stringstream'
Error C2088, line 7153: built-in operator '<<' cannot be applied to an operand of type 'std::basic_ostream<char,std::char_traits<char>>'
Error C2280, lines 7137, 7153, 7166, 7168, 7172: 'std::basic_ostream<char,std::char_traits<char>> &std::operator <<<std::char_traits<char>>(std::basic_ostream<char,std::char_traits<char>> &,const char8_t *)': attempting to reference a deleted function
Error C2664, line 6821: 'bool llama_chat_detect_template::<lambda_1>::operator ()(const char *) const': cannot convert argument 1 from 'const char8_t [9]' to 'const char *'
Error C2664, line 6826: same diagnostic with 'const char8_t [28]', 'const char8_t [18]', and 'const char8_t [13]'
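(For context, a minimal sketch of what these diagnostics usually mean, assuming the affected lines stream or pass u8"..." literals; the token string and helper function below are made up for illustration and are not taken from llama.cpp. Under /std:c++20, u8 literals have type const char8_t[N], the ostream operator<< overloads for char8_t pointers are deleted, and char8_t* no longer converts implicitly to const char*.)

```cpp
// Illustrative sketch only, not the actual llama.cpp code.
// Under /std:c++20, u8"..." literals are const char8_t[N]; streaming them
// hits the deleted operator<< overloads (C2088/C2280), and passing them to a
// const char* parameter fails to convert (C2664).
#include <iostream>
#include <sstream>
#include <string>

// Stand-in for a predicate taking const char*, like the lambda the C2664
// errors point at in llama_chat_detect_template.
static bool contains(const std::string & tmpl, const char * needle) {
    return tmpl.find(needle) != std::string::npos;
}

int main() {
    std::stringstream ss;
    // ss << u8"<|im_start|>";                           // C2088/C2280 in C++20
    ss << "<|im_start|>";                                 // fix: plain narrow literal
    // or keep the u8 prefix and cast explicitly:
    // ss << reinterpret_cast<const char *>(u8"<|im_start|>");

    // contains(ss.str(), u8"<|im_start|>");             // C2664: char8_t* does not convert to char*
    bool ok = contains(ss.str(), "<|im_start|>");         // fix: drop the u8 prefix

    std::cout << ok << '\n';
    // (Another MSVC-side workaround is building with /Zc:char8_t-, which
    //  restores the pre-C++20 meaning of u8 literals.)
    return 0;
}
```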

@ikawrakow
Owner Author

That's not the actual ik_llama.cpp but rather your modified fork, right?

@Nexesenex
Contributor

Nexesenex commented Dec 27, 2025

That's not the actual ik_llama.cpp but rather your modified fork, right?

I tried to compile the main branch; same result. :/

[Screenshots: the "Fix Windows build" Pull Request #1097 page and the Microsoft Visual Studio error list for ik_llama.cpp.fks]

@ikawrakow
Owner Author

@Thireus I see you are trying to fix the Windows build of the main branch. Have you tried this PR?

@Thireus
Contributor

Thireus commented Dec 28, 2025

Ah! Thanks for pointing this out: yes, I've been trying to switch the MSVC flags to use the experimental version of OpenMP, but without success so far. I'll try this PR.
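(For reference, an illustrative snippet, not code from ik_llama.cpp, of why the classic /openmp switch can trip up on this kind of codebase: MSVC's default mode implements OpenMP 2.0 and only accepts signed loop counters, whereas the experimental LLVM-based mode, /openmp:llvm, also accepts unsigned indices such as size_t.)

```cpp
// Illustrative only; assumed loop shape, not code from ik_llama.cpp.
// With the classic MSVC switch /openmp (OpenMP 2.0) the loop below is rejected
// because the loop counter is unsigned; building with /openmp:llvm accepts it.
#include <cstddef>
#include <vector>

void scale(std::vector<float> & v, float s) {
    #pragma omp parallel for
    for (std::size_t i = 0; i < v.size(); ++i) {  // unsigned counter: needs /openmp:llvm
        v[i] *= s;
    }
}
```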

@Thireus
Contributor

Thireus commented Dec 28, 2025

Looking good so far, it's compiling. We'll see if it produces release builds in a few hours - https://github.com/Thireus/ik_llama.cpp/actions/runs/20555781303
