Commit db76ce3 (1 parent: a269edb)

♻️ refactor(frontend): Fix ESLint warnings & small improvements

7 files changed: +25 −3 lines

src/frontend-nextjs/app/api/post/route.ts
Lines changed: 1 addition & 0 deletions

@@ -12,6 +12,7 @@ import { createNewPost } from '@/services/api/CreatePostService';
 export async function POST(request: Request): Promise<NextResponse> {
     const body = await request.json();
     let header = request.headers.get('header');
+
     if (header) {
         header = Buffer.from(header, 'latin1').toString('utf-8');
     }
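The latin1-to-UTF-8 re-decoding used in this route can be exercised in isolation. A minimal Node.js sketch (the sample header value is illustrative, not from the commit):

```typescript
import { Buffer } from 'node:buffer';

// A UTF-8 header value that was mis-decoded as latin1 shows up as mojibake:
// the bytes 0xC3 0xA4 ('ä' in UTF-8) render as the two latin1 chars 'Ã¤'.
const garbled = 'Ã¤';

// Re-encode the string as latin1 to recover the raw bytes, then decode as UTF-8.
const decoded = Buffer.from(garbled, 'latin1').toString('utf-8');

console.log(decoded); // 'ä'
```

This only round-trips cleanly when the original bytes really were valid UTF-8; otherwise the decode yields replacement characters.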

src/frontend-nextjs/app/post/page.tsx
Lines changed: 1 addition & 1 deletion

@@ -30,10 +30,10 @@ function SinglePost() {
                 <PostComponent
                     body={postData.body}
                     imageUrl={postData.imageUrl}
+                    isSpamPredictedLabel={postData.isSpamPredictedLabel}
                     postId={postData.postId}
                     timestamp={postData.timestamp}
                     username={postData.username}
-                    isSpamPredictedLabel={postData.isSpamPredictedLabel}
                 />
             )}
         </div>

src/frontend-nextjs/components/SwaggerUIReact.tsx
Lines changed: 1 addition & 0 deletions

@@ -21,6 +21,7 @@ const HideTryItOutPlugin = () => ({
 
 function ReactSwagger({ spec }: Props) {
     const specWithoutServers = { ...spec };
+
     delete specWithoutServers.servers;
 
     return <SwaggerUI spec={specWithoutServers} plugins={[HideAuthorizePlugin, HideTryItOutPlugin]} />;
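The `{ ...spec }` copy in `ReactSwagger` exists so that deleting `servers` does not mutate the caller's spec object. A standalone sketch of that pattern (the sample spec shape is made up for illustration):

```typescript
// Hypothetical minimal OpenAPI-like spec object, for illustration only.
const spec: { info: { title: string }; servers?: { url: string }[] } = {
    info: { title: 'Demo API' },
    servers: [{ url: 'http://localhost:8080' }],
};

// Shallow-copy first, then delete: the original object keeps its servers list.
const specWithoutServers = { ...spec };
delete specWithoutServers.servers;

console.log(specWithoutServers.servers); // undefined
console.log(spec.servers?.length); // 1
```

Note the copy is shallow: nested objects such as `info` are still shared between `spec` and `specWithoutServers`, which is fine here because only a top-level key is removed.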

src/frontend-nextjs/components/Timeline/PostSpamPrediction.tsx
Lines changed: 1 addition & 0 deletions

@@ -15,6 +15,7 @@ export interface PostSpamPredictionProps {
 
 export function PostSpamPrediction(props: Readonly<PostSpamPredictionProps>) {
     const { isLoggedIn } = useCheckLogin();
+
     if (props.isSpamPredictedLabel == null) return null;
 
     const color = props.isSpamPredictedLabel ? 'danger' : 'primary';

src/frontend-nextjs/components/Timeline/SpamPredictionUserRating.tsx
Lines changed: 3 additions & 2 deletions

@@ -23,15 +23,16 @@ export function SpamPredictionUserRating(props: Readonly<SpamPredictionUserRatin
     return (
         <div>
             <Button
-                className=' text-default-600 bg-transparent'
+                className='bg-transparent text-default-600 px-2 min-w-0 gap-1'
                 name='upvoteSpamRating'
                 onPress={() => handleSpamPredictionUpvote()}
             >
                 <p>{spamPredictionUserRatingData?.spamPredictionUserUpvotes}</p>
                 {spamPredictionUserRatingData?.isUpvotedByUser ? <BsHandThumbsUpFill /> : <BsHandThumbsUp />}
             </Button>
+
             <Button
-                className=' text-default-600 bg-transparent'
+                className='bg-transparent text-default-600 px-2 min-w-0 gap-1'
                 name='downvoteSpamRating'
                 onPress={() => handleSpamPredictionDownvote()}
             >

src/microblog-service/README.md
Lines changed: 2 additions & 0 deletions

@@ -60,3 +60,5 @@ To get more information about the JAEGER config options, see https://www.jaegert
 | JAEGER_SAMPLER_TYPE | const | (optional) Set to const to get all traces |
 | JAEGER_SAMPLER_PARAM | 1 | (optional) Set to 1 while sampler is const to get all traces |
 | USER_AUTH_SERVICE_ADDRESS | unguard-user-auth-service | Change to hostname/IP of user-auth-service instance |
+| RAG_SERVICE_ADDRESS | unguard-rag-service | Change to hostname/IP of rag-service instance |
+| RAG_SERVICE_PORT | 8000 | Change to port number of rag-service instance |

src/rag-service/README.md
Lines changed: 16 additions & 0 deletions

@@ -10,6 +10,22 @@ A microservice for spam classification using Retrieval-Augmented Generation (RAG
 - **[Langdock](https://www.langdock.com)**: OpenAI LLM and embeddings
 - **[Ollama](https://ollama.com/)**: Local open-source LLM and embeddings
 
+## Environment Variables
+| Name | Example Value | Description |
+|------|---------------|-------------|
+| LLM_MODEL | llama3.2:latest | The LLM model that will be used by the RAG service |
+| EMBEDDINGS_MODEL | nomic-embed-text | The embeddings model that will be used by the RAG service |
+| MODEL_PROVIDER | Ollama | Can be either "Ollama" or "LangDock" |
+| MODEL_PROVIDER_BASE_URL | http://localhost:11434 | Base URL of your model provider |
+| LANGDOCK_API_KEY | <your_langdock_api_key> | (optional) Langdock API key, only needed when using LangDock |
+| EVALUATE_AFTER_ATTACK | false | Configure whether the RAG service performance should be evaluated after an attack |
+| LIMIT_EVALUATION_SAMPLES | 0 | Limit the number of samples used for the evaluation |
+| LIMIT_KEYWORD_ATTACK_SUCCESS_EVALUATION_SAMPLES | 0 | Limit the number of samples used for the keyword-attack success evaluation |
+| USE_DATA_POISONING_DETECTION | false | Configure whether data poisoning detection should be enabled |
+| DATA_POISONING_DETECTION_STRATEGY | embedding_similarity_entry_level | The data poisoning detection strategy to use |
+| LABEL_CONSISTENCY_DETECTION_DECISION_VARIANT | majority_voting | The variant of the label consistency detection strategy to use |
+
+
 ## Getting Started for running the RAG Service locally
 
 When starting for the first time, you need to create a Virtual Environment, activate it and install the dependencies:
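Variables like the ones documented in this README are typically read once at startup with defaults. A hedged sketch of that pattern (the `loadRagConfig` function, its config shape, and the defaults are assumptions for illustration; the rag-service's actual startup code is not part of this commit):

```typescript
// Illustrative only: parse environment variables with fallbacks matching the
// README's example values. Boolean flags arrive as strings and must be compared.
function loadRagConfig(env: Record<string, string | undefined>) {
    return {
        llmModel: env.LLM_MODEL ?? 'llama3.2:latest',
        embeddingsModel: env.EMBEDDINGS_MODEL ?? 'nomic-embed-text',
        modelProvider: env.MODEL_PROVIDER ?? 'Ollama',
        modelProviderBaseUrl: env.MODEL_PROVIDER_BASE_URL ?? 'http://localhost:11434',
        evaluateAfterAttack: (env.EVALUATE_AFTER_ATTACK ?? 'false') === 'true',
        useDataPoisoningDetection: (env.USE_DATA_POISONING_DETECTION ?? 'false') === 'true',
    };
}

const config = loadRagConfig({ MODEL_PROVIDER: 'LangDock', EVALUATE_AFTER_ATTACK: 'true' });
console.log(config.modelProvider, config.evaluateAfterAttack); // LangDock true
```

Parsing flags through an explicit `=== 'true'` comparison avoids the classic pitfall where any non-empty string (including "false") is truthy.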
