Ricardo Čerljenko
Posted on February 10, 2023
Recently I stumbled across the OpenAI Moderations API, which lets you query OpenAI to detect whether input text contains inappropriate content such as hate, violence, etc.
The API is completely free; you just need to create an OpenAI account and issue a fresh API token.
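For context, a direct call to the moderations endpoint looks roughly like the sketch below, here using Laravel's HTTP client. Only the endpoint URL and the request/response shape come from OpenAI's public docs; where you store the token (the config key shown) is just an illustrative assumption.

<?php

use Illuminate\Support\Facades\Http;

// Sketch of a raw moderation request; adjust where you keep your API key.
$response = Http::withToken(config('services.openai.key'))
    ->post('https://api.openai.com/v1/moderations', [
        'input' => 'Some user-submitted text to check.',
    ])
    ->json();

// Each result carries a boolean "flagged" plus per-category flags and scores.
$isFlagged = $response['results'][0]['flagged'] ?? false;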
Since I'm primarily a Laravel PHP Framework developer, I decided to make a package that validates request payload fields against the OpenAI Moderations API.
Installation
Standard Composer package installation:
composer require rcerljenko/laravel-openai-moderation
Usage
- Publish the config and translation files:
php artisan vendor:publish --provider="RCerljenko\LaravelOpenAIModeration\LaravelOpenAIModerationServiceProvider"
- Set your OpenAI API key and enable the package via the newly created config file: config/openai.php
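The published file will look something like the sketch below; the exact key names belong to the package, so treat these as placeholders and check the actual file.

<?php

// config/openai.php (illustrative shape only; verify the real key names)
return [
    // Toggle the moderation check on or off, e.g. per environment.
    'enabled' => (bool) env('OPENAI_ENABLED', true),

    // Your OpenAI API token.
    'api_key' => env('OPENAI_API_KEY'),
];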
- Use the provided rule in your validation rules:
<?php

namespace App\Http\Requests;

use Illuminate\Foundation\Http\FormRequest;
use RCerljenko\LaravelOpenAIModeration\Rules\OpenAIModeration;

class StoreText extends FormRequest
{
    /**
     * Determine if the user is authorized to make this request.
     */
    public function authorize(): bool
    {
        return true;
    }

    /**
     * Get the validation rules that apply to the request.
     */
    public function rules(): array
    {
        return [
            'text' => ['required', 'string', new OpenAIModeration],
        ];
    }
}
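When the Moderations API flags the text, the rule fails and Laravel responds with its standard validation error (a 422 for JSON requests). A hypothetical controller using the request above might look like this; the controller name and response shape are made up for illustration.

<?php

namespace App\Http\Controllers;

use App\Http\Requests\StoreText;
use Illuminate\Http\JsonResponse;

class TextController extends Controller
{
    // StoreText runs its rules (including OpenAIModeration) before this
    // method is reached, so flagged input never gets here.
    public function store(StoreText $request): JsonResponse
    {
        return response()->json([
            'text' => $request->validated('text'),
        ], 201);
    }
}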
And that's it! Your content can now be validated with the powerful (and yet free) OpenAI Moderations API.
Thank you for reading this! If you've found it interesting, consider leaving a ❤️ or a 🦄, and of course, share it and comment with your thoughts!
Lloyds is available for partnerships and open for new projects. If you want to know more about us, check us out.