Baselight

500 Essay Prompts - Gemini-flash

For the LLMs - You Can't Please Them All Competition

@kaggle.matthewsfarmer_500_essay_prompts_gemini_flash

About this Dataset

Generated with the following script:

import os
import pandas as pd
import google.generativeai as genai

# Read the API key from the environment rather than a bare, undefined name
genai.configure(api_key=os.environ["GOOGLE_AI_API"])

generation_config = {
  "temperature": 1,
  "top_p": 0.9,
  "top_k": 40,
  "max_output_tokens": 8192,
  "response_mime_type": "text/plain",
}

model = genai.GenerativeModel(
  model_name="gemini-1.5-flash",
  generation_config=generation_config,
  system_instruction="You are an intelligent assistant.",
)

chat_session = model.start_chat(history=[])

domains = [
  "Science",
  "Technology",
  "Engineering",
  "Mathematics",
  "History",
  "Literature",
  "Philosophy",
  "Art",
  "Music",
  "Politics",
  "Economics",
  "Psychology",
  "Sociology",
  "Anthropology",
  "Biology",
  "Chemistry",
  "Physics",
  "Astronomy",
  "Geology",
  "Geography",
  "Environmental Science",
  "Computer Science",
  "Healthcare",
  "Medicine",
  "Law",
  "Business",
  "Education",
  "Sports",
  "Games",
  "Fashion",
  "Food",
  "Photography",
  "Film",
  "Theater",
  "Dance",
  "Television",
  "Radio",
  "Journalism",
  "Social Media",
  "Advertising",
  "Marketing",
  "Public Relations",
  "Politics",
  "Government",
  "Military",
  "Intelligence",
]

responses = []

# Cycle through the domain list round-robin until 500 prompts are collected
for i in range(500):
  domain = domains[i % len(domains)]
  response = chat_session.send_message(
    f"Generate an essay prompt from the following domain: {domain}. "
    "The prompt should be about 10-20 words long. You are creating a list "
    "of possible essay questions/requests, so be creative. Ensure that the "
    "request is answerable without the need for additional input."
  )
  responses.append(response.text)
df = pd.DataFrame(responses, columns=["topic"])
df.to_csv("essay_prompts.csv", index=False)
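One consequence of the `i % len(domains)` indexing is that prompts are not spread perfectly evenly: the list holds 46 entries (with "Politics" appearing twice), so over 500 iterations the first 500 % 46 = 40 entries receive 11 prompts each and the remaining 6 receive 10. A quick sketch of that distribution, using integer stand-ins for the domain strings:

```python
from collections import Counter

# Round-robin assignment: prompt i draws from domains[i % len(domains)].
# With 46 entries and 500 prompts, the first 500 % 46 = 40 entries get
# 11 prompts each and the remaining 6 get 10.
domains = list(range(46))  # stand-ins for the 46 domain strings above
counts = Counter(domains[i % len(domains)] for i in range(500))

assert sum(counts.values()) == 500
assert sorted(set(counts.values())) == [10, 11]
```

Because "Politics" occupies two slots in the list, that domain effectively accounts for roughly twice as many prompts as the others.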
