
Add: Toxic Comment Classification #1198

Open
KowshikaSinivasan opened this issue Jan 3, 2025 · 4 comments
@KowshikaSinivasan

Description:
I propose adding a Toxic Comment Classification feature that classifies comments as "toxic" or "not toxic." The implementation will follow the steps below.

Solution:
Preprocess the text (remove stopwords, punctuation, lowercase).
Use TF-IDF for feature extraction.
Train a simple classifier.
Evaluate using accuracy and F1-score.
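A minimal sketch of this pipeline using scikit-learn, assuming a CSV dataset with `comment_text` and `toxic` (0/1) columns and logistic regression as the "simple classifier" (the file name, column names, and model choice here are illustrative, not fixed):

```python
# Illustrative sketch of the proposed pipeline; dataset path and
# column names ("comments.csv", "comment_text", "toxic") are assumptions.
import string

import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split


def preprocess(text: str) -> str:
    # Lowercase and strip punctuation; stopword removal is handled by
    # TfidfVectorizer's built-in English stop-word list below.
    text = text.lower()
    return text.translate(str.maketrans("", "", string.punctuation))


df = pd.read_csv("comments.csv")  # hypothetical dataset
X = df["comment_text"].astype(str).map(preprocess)
y = df["toxic"]  # 1 = toxic, 0 = not toxic

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# TF-IDF feature extraction (step 2)
vectorizer = TfidfVectorizer(stop_words="english", max_features=20_000)
X_train_tfidf = vectorizer.fit_transform(X_train)
X_test_tfidf = vectorizer.transform(X_test)

# Simple classifier (step 3)
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train_tfidf, y_train)

# Evaluation with accuracy and F1-score (step 4)
pred = clf.predict(X_test_tfidf)
print("Accuracy:", accuracy_score(y_test, pred))
print("F1-score:", f1_score(y_test, pred))
```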

Alternatives:
Pretrained models (e.g., BERT): these are more complex and require more computational resources. Starting with a simpler approach is easier for beginners.
Manual moderation: this is time-consuming and does not scale compared to an automated model.

Kindly assign me this issue.


github-actions bot commented Jan 3, 2025

Thanks for creating the issue. Please read the pinned issues first and the README.md in each pull request you make. Keep learning...

@KowshikaSinivasan
Author

I am an SWOC25 contributor, so please assign me this issue under SWOC25.

@Vaibhav2154
Collaborator

@KowshikaSinivasan
I've assigned you the issue.
Reach out if you need any help.
Add your project with proper documentation.

@kashifalikhan36

I'm interested; please assign me this issue.
