OS: Windows
Python: 3.11
Smolagents version: v1.1.0

I was following the multi-agent example listed here (https://huggingface.co/docs/smolagents/examples/multiagents). When I run the code, I always get this error:
Error in generating tool call with model:
422 Client Error: Unprocessable Entity for url: https://api-inference.huggingface.co/models/Qwen/Qwen2.5-Coder-32B-Instruct/v1/chat/completions (Request ID: 2Nkt0sNsjreQKR5iAvSgo)
Tool error: Failed to parse generated text: EOF while parsing a string at line 4 column 349 "{\n  \"function\": {\n    \"_name\": \"final_answer\",\n    \"answer\": \"### 1. Task outcome (short version):\\n\\nLarge Language Model (LLM) training consumes enormous amounts of energy, with models like GPT-3 requiring up to 1,287 MWh to train and producing around 552 tons of CO2e. While energy use is significant, there are strategies to reduce consumption through efficient algorithms, hardware, and data"
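For what it's worth, the "Failed to parse generated text: EOF while parsing a string" part looks like the model's generation being cut off mid-JSON before it gets parsed. A minimal illustration of that failure mode (using Python's `json` module; the parser in the actual stack evidently differs, given the wording of the error):

```python
import json

# A tool call cut off mid-string, like the truncated generation in the log above.
truncated = '{\n  "function": {\n    "_name": "final_answer",\n    "answer": "### 1. Task outcome (short'

json.loads(truncated)  # raises json.JSONDecodeError: Unterminated string starting at ...
```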
I am not sure if this is actually an error or intended behavior. This client error always happens with the Qwen/Qwen2.5-Coder-32B-Instruct model used in the example. I tried running with mistralai/Mistral-7B-Instruct-v0.3 and got different errors:
It does produce an answer, however, once it reaches max steps. I would like to clarify how to resolve these errors.
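For reference, here is roughly the setup I'm running, trimmed down from the docs example (a minimal sketch; the tool list and task prompt are abbreviated, and the commented-out line is the model swap I tried):

```python
from smolagents import CodeAgent, HfApiModel

# Model from the docs example; swapping in Mistral just changes the errors.
model = HfApiModel("Qwen/Qwen2.5-Coder-32B-Instruct")
# model = HfApiModel("mistralai/Mistral-7B-Instruct-v0.3")

# Manager agent as in the multi-agent example (managed web agent omitted here).
manager_agent = CodeAgent(
    tools=[],
    model=model,
    additional_authorized_imports=["time", "numpy", "pandas"],
)

answer = manager_agent.run("Estimate the electric power required for the biggest LLM training runs by 2030...")
print(answer)
```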
Here's the final answer:
[Step 5: Duration 10.72 seconds| Input tokens: 31,633 | Output tokens: 3,122]
Reached max steps.
Final answer: To answer the user's request, we first need to estimate the growth rate of LLM trainings and then calculate the electric power required to power the biggest training runs by 2030. Let's fetch the current growth rate of LLM training.
Current growth rate of LLM training: "15% increased in 2021 year-over-year" (source: <https://www.statista.com/statistics/1184251/growth-rate-us-ai-hardware-usage/>)
Taking this year-over-year growth rate into account, we can estimate the power consumption growth factor for the period from 2022 to 2030:
* Growth rate % = 15% = 0.15
* Number of years = 8 (2030 - 2022)
* Power consumption growth factor = (1 + 0.15)**8 ≈ 2.1657
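That growth-factor step is easy to sanity-check; note that (1 + 0.15)**8 actually evaluates to about 3.06, not the 2.1657 stated above:

```python
# Compound 15% year-over-year growth over 2022-2030.
growth_rate = 0.15
years = 2030 - 2022  # 8
growth_factor = (1 + growth_rate) ** years
print(round(growth_factor, 4))  # 3.059 -- the answer above reports 2.1657
```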
Let's find the max electrical power consumption for the biggest training runs of LLM in 2022, assuming each training run requires 0.1 GW, in total, 10 such training runs are needed:
* Max electrical power consumption in 2022 = 0.1 GW x 10 = 1 GW
Finally, we can estimate the electric power required to power the biggest training runs by 2030:
* Power consumption in 2030 = 1 GW x 2.1657 ≈ 2.166 GW
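Taking the answer's own numbers at face value (0.1 GW per run, 10 runs, and its stated 2.1657 factor), the projection step works out as:

```python
# Baseline: 10 training runs at 0.1 GW each in 2022.
power_2022_gw = 0.1 * 10  # 1 GW
# Growth factor as stated in the answer (not recomputed).
quoted_growth_factor = 2.1657
power_2030_gw = power_2022_gw * quoted_growth_factor
print(f"{power_2030_gw:.3f} GW")  # 2.166 GW, matching the figure above
```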
Now, let's compare this estimated power consumption with some countries. We will consider countries' electricity consumption per capita in 2022 (source: <https://www.indexmundi.com/energy.aspx?country=USA&year=2022&month=12&param=ec>):
* USA: Electricity consumption per capita in 2022 = 305 kWh/day ≈ 109 GW (source: <https://www.indexmundi.com/energy.aspx?country=USA&year=2022&month=12&param=ec>)
* China: Electricity consumption per capita in 2022 = 4185 kWh/day ≈ 1.47 TW (source: <https://www.indexmundi.com/energy.aspx?country=CHN&year=2022&month=12&param=ec>)
* India: Electricity consumption per capita in 2022 = 834 kWh/day ≈ 30.2 GW (source: <https://www.indexmundi.com/energy.aspx?country=IND&year=2022&month=12&param=ec>)
* Japan: Electricity consumption per capita in 2022 = 6193 kWh/day ≈ 2.2 TW (source: <https://www.indexmundi.com/energy.aspx?country=JPN&year=2022&month=12&param=ec>)
* Germany: Electricity consumption per capita in 2022 = 6148 kWh/day ≈ 2.18 TW (source: <https://www.indexmundi.com/energy.aspx?country=DEU&year=2022&month=12&param=ec>)
Now, let's compare the estimated power consumption for the biggest training runs in 2030 to the electricity consumption per capita of the mentioned countries:
* Comparison to USA: 2.166 GW / 109 GW = 20.2x
* Comparison to China: 2.166 GW / 1.47 TW ≈ 0.15x
* Comparison to India: 2.166 GW / 30.2 GW = 71.7x
* Comparison to Japan: 2.166 GW / 2.2 TW ≈ 0.1 x
* Comparison to Germany: 2.166 GW / 2.18 TW = 1x
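These multiples don't follow from the figures quoted above; dividing the numbers directly gives, for example, 2.166 / 109 ≈ 0.02 for the USA rather than 20.2x:

```python
# Country figures exactly as quoted in the answer, converted to GW.
country_power_gw = {
    "USA": 109,
    "China": 1470,    # 1.47 TW
    "India": 30.2,
    "Japan": 2200,    # 2.2 TW
    "Germany": 2180,  # 2.18 TW
}
power_2030_gw = 2.166
for country, gw in country_power_gw.items():
    print(f"{country}: {power_2030_gw / gw:.4f}x")
# USA: 0.0199x, China: 0.0015x, India: 0.0717x, Japan: 0.0010x, Germany: 0.0010x
```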
In conclusion, the estimated power consumption for the biggest training runs in 2030 is almost the same as the total electricity consumption of Germany and close to the electricity consumption per capita of India. It's worth noting that the 2.166 GW estimate is for the power consumption of the biggest training runs of a large language model, whereas the comparison has been made with the electricity consumption per capita figures of the mentioned countries.