prompt_config.py 1.4 KB

import json
import sys


def prompt_with_config(context):
    """
    A Python prompt function that returns both prompt content and configuration.
    """
    vars = context["vars"]
    provider = context.get("provider", {})

    # Dynamic configuration based on the topic
    if vars["topic"] == "the Roman Empire" or vars["topic"] == "bob dylan":
        # Complex topics need more elaboration
        temperature = 0.8
        max_tokens = 200
    else:
        # Simple topics can be more constrained
        temperature = 0.4
        max_tokens = 100

    # Return structured object with both prompt and config
    return {
        "prompt": [
            {
                "role": "system",
                "content": f"You are a Python expert assistant using {provider.get('id', 'an AI')} model. Be technical and precise.",
            },
            {
                "role": "user",
                "content": f"Explain {vars['topic']} as if it were a Python library. What would its main classes and methods be?",
            },
        ],
        "config": {
            "temperature": temperature,
            "max_tokens": max_tokens,
            "presence_penalty": 0.1,
            "frequency_penalty": 0.3,
        },
    }


if __name__ == "__main__":
    # When executed directly, run the prompt_with_config function
    print(json.dumps(prompt_with_config(json.loads(sys.argv[1]))))
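A minimal sketch of how a harness would exercise this file: the context arrives as a JSON string on `sys.argv[1]`, and the script prints a JSON object with `prompt` and `config` keys on stdout. The function body is reproduced from `prompt_config.py` above so the snippet runs standalone; the provider id `openai:gpt-4` is a hypothetical example value, not something the source specifies.

import json


def prompt_with_config(context):
    # Reproduced from prompt_config.py so this sketch is self-contained.
    vars = context["vars"]
    provider = context.get("provider", {})
    if vars["topic"] in ("the Roman Empire", "bob dylan"):
        temperature, max_tokens = 0.8, 200
    else:
        temperature, max_tokens = 0.4, 100
    return {
        "prompt": [
            {
                "role": "system",
                "content": f"You are a Python expert assistant using {provider.get('id', 'an AI')} model. Be technical and precise.",
            },
            {
                "role": "user",
                "content": f"Explain {vars['topic']} as if it were a Python library. What would its main classes and methods be?",
            },
        ],
        "config": {
            "temperature": temperature,
            "max_tokens": max_tokens,
            "presence_penalty": 0.1,
            "frequency_penalty": 0.3,
        },
    }


# Simulate the harness call: serialize the context to JSON, as it would
# appear on argv[1], then parse it and invoke the function.
raw_context = json.dumps(
    {"vars": {"topic": "the Roman Empire"}, "provider": {"id": "openai:gpt-4"}}
)
result = prompt_with_config(json.loads(raw_context))
print(result["config"])

Because "the Roman Empire" matches the complex-topic branch, the returned config carries `temperature: 0.8` and `max_tokens: 200`; any other topic would get the constrained 0.4/100 pair.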