@@ -87,8 +87,8 @@ graph_config = {

# Create the SmartScraperGraph instance
smart_scraper_graph = SmartScraperGraph(
-    prompt="Find some information about what does the company do, the name and a contact email.",
-    source="https://scrapegraphai.com/",
+    prompt="Extract me all the news from the website",
+    source="https://www.wired.com",
    config=graph_config
)
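For readers following the diff, a minimal end-to-end sketch of the updated example is shown below. The `graph_config` values (API key placeholder, model name, `verbose`/`headless` flags) are illustrative assumptions rather than the exact configuration defined earlier in the README.

```python
# Minimal sketch of the updated example; the config values are assumptions,
# not the configuration defined earlier in the README.
import json

from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "api_key": "YOUR_OPENAI_API_KEY",  # placeholder credential
        "model": "openai/gpt-4o-mini",     # assumed model name
    },
    "verbose": True,
    "headless": True,
}

# Create the SmartScraperGraph instance with the new prompt and source
smart_scraper_graph = SmartScraperGraph(
    prompt="Extract me all the news from the website",
    source="https://www.wired.com",
    config=graph_config,
)

# Run the pipeline and pretty-print the extracted dictionary
result = smart_scraper_graph.run()
print(json.dumps(result, indent=4))
```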
@@ -100,10 +100,20 @@ print(json.dumps(result, indent=4))
The output will be a dictionary like the following:

```python
-{
-    "company": "ScrapeGraphAI",
-    "name": "ScrapeGraphAI Extracting content from websites and local documents using LLM",
-    "contact_email": "[email protected]"
+"result": {
+    "news": [
+        {
+            "title": "The New Jersey Drone Mystery May Not Actually Be That Mysterious",
+            "link": "https://www.wired.com/story/new-jersey-drone-mystery-maybe-not-drones/",
+            "author": "Lily Hay Newman"
+        },
+        {
+            "title": "Former ByteDance Intern Accused of Sabotage Among Winners of Prestigious AI Award",
+            "link": "https://www.wired.com/story/bytedance-intern-best-paper-neurips/",
+            "author": "Louise Matsakis"
+        },
+        ...
+    ]
}
```
There are other pipelines that can be used to extract information from multiple pages, generate Python scripts, or even generate audio files.
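As an illustration of the multi-page case, here is a hedged sketch using `SmartScraperMultiGraph`; the URL list and configuration are placeholders, and the constructor signature should be checked against the installed `scrapegraphai` version.

```python
# Hedged sketch of a multi-page pipeline; URLs and config are placeholders.
import json

from scrapegraphai.graphs import SmartScraperMultiGraph

graph_config = {
    "llm": {
        "api_key": "YOUR_OPENAI_API_KEY",  # placeholder credential
        "model": "openai/gpt-4o-mini",     # assumed model name
    },
}

# Assumes SmartScraperMultiGraph accepts a list of URLs as its source
multi_scraper_graph = SmartScraperMultiGraph(
    prompt="Extract me all the news from the website",
    source=[
        "https://www.wired.com",
        "https://www.wired.com/category/science/",  # illustrative second page
    ],
    config=graph_config,
)

result = multi_scraper_graph.run()
print(json.dumps(result, indent=4))
```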