Commit 7b611b4

Authored Oct 11, 2023
llmodel: print an error if the CPU does not support AVX (nomic-ai#1499)
1 parent f81b4b4 commit 7b611b4

File tree: 1 file changed, +3 −2 lines


gpt4all-backend/llmodel.cpp

@@ -121,9 +121,10 @@ const LLModel::Implementation* LLModel::Implementation::implementation(const cha
 }
 
 LLModel *LLModel::Implementation::construct(const std::string &modelPath, std::string buildVariant) {
-
-    if (!has_at_least_minimal_hardware())
+    if (!has_at_least_minimal_hardware()) {
+        std::cerr << "LLModel ERROR: CPU does not support AVX\n";
         return nullptr;
+    }
 
     // Get correct implementation
     const Implementation* impl = nullptr;
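
For context, below is a minimal sketch of how an AVX capability check along the lines of has_at_least_minimal_hardware() could be written. The helper name cpu_supports_avx() and the whole body are hypothetical; the actual gpt4all helper is not part of this diff and may differ. It only reuses the error message added by this commit.

// Hypothetical AVX check; NOT the actual gpt4all implementation.
#include <iostream>

#if defined(__GNUC__) || defined(__clang__)
static bool cpu_supports_avx() {
    // GCC/Clang builtin: queries CPUID at runtime for the "avx" feature.
    return __builtin_cpu_supports("avx");
}
#elif defined(_MSC_VER)
#include <intrin.h>
static bool cpu_supports_avx() {
    // MSVC: CPUID leaf 1, ECX bit 28 reports AVX.
    // (A fully robust check would also confirm OS XSAVE support via XGETBV.)
    int info[4];
    __cpuid(info, 1);
    return (info[2] & (1 << 28)) != 0;
}
#else
static bool cpu_supports_avx() { return false; } // unknown compiler: be conservative
#endif

int main() {
    if (!cpu_supports_avx()) {
        // Same error path this commit adds to LLModel::Implementation::construct().
        std::cerr << "LLModel ERROR: CPU does not support AVX\n";
        return 1;
    }
    std::cout << "AVX is available on this CPU\n";
    return 0;
}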

0 commit comments