I was invited to the Bot Advent Calendar 2016 (/advent-calendar/2016/bot), probably because I participated in the Crawler / Web Scraping Advent Calendar 2016. Since I wanted to write something for it, I made the LINE BOT I had left undone in 2016.
I made a BOT to introduce Apple's refurbished products.
It has the following features.
- Replies, via the Reply API, with a randomly selected Apple refurbished product page whose title contains the entered text (e.g. enter "Mac" to search for products containing "Mac")
- If the search finds nothing, the entered text is echoed back as-is
If you'd like to try it out, please add it as a friend from the QR code below.
The final code looks like this:
from __future__ import print_function

import json
import os
import random

import boto3
import requests

print('Loading function')

LINE_API_ENDPOINT = 'https://api.line.me/v2/bot/message/reply'
LINE_API_HEADERS = {
    'Authorization': 'Bearer ' + os.environ['LINE_CHANNEL_ACCESS_TOKEN'],
    'Content-type': 'application/json'
}


def lambda_handler(event, context):
    for webhook_event in event['events']:
        reply_token = webhook_event['replyToken']
        message = webhook_event['message']
        payload = {
            'replyToken': reply_token,
            'messages': []
        }
        # Search the crawled product list for the entered text
        items = get_items_by_keyword(message['text'])
        if len(items) == 0:
            # No hits: echo the entered text back as is
            payload['messages'].append({
                'type': 'text', 'text': message['text']
            })
        else:
            # Reply with one product picked at random
            item = items[0]
            payload['messages'].append({
                'type': 'text', 'text': item['title'] + item['price'] + item['link']
            })
        response = requests.post(LINE_API_ENDPOINT, headers=LINE_API_HEADERS, data=json.dumps(payload))
        print(response.status_code)


def get_items_by_keyword(keyword=None):
    # Read the crawled product list (JSON) from S3
    key = 'items.json'
    s3 = boto3.client('s3')
    response = s3.get_object(Bucket='apple-refurbished', Key=key)
    items = json.load(response['Body'])
    # Keep only items whose title contains the keyword, in random order
    result = []
    for item in items:
        if item['title'].find(keyword) != -1:
            result.append(item)
    random.shuffle(result)
    return result
The endpoint, headers, and so on are set by referring to the API Reference: Messaging API -> Reply Message.
The Channel Access Token is set as an environment variable from the Lambda settings screen.
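For reference, the webhook event that lambda_handler receives looks roughly like the following. This is a simplified sketch based on the Messaging API webhook format; the values are dummies, and the real event also carries fields such as source and timestamp that the code above does not use.

```python
# A simplified sketch of the webhook event passed to lambda_handler.
# Field names follow the LINE Messaging API webhook format; values are dummies.
sample_event = {
    'events': [
        {
            'type': 'message',
            'replyToken': 'dummy-reply-token',
            'message': {
                'type': 'text',
                'text': 'Mac'
            }
        }
    ]
}
```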
requests is not installed in the Lambda environment, so it has to be uploaded together with the function in the ZIP file. Install requests directly under the project directory:
$ cd /path/to/project
$ pip install requests -t .
The directory tree looks like this:
.
├── lambda_function.py
├── requests
│   ├── __init__.py
│   ├── ...
└── requests-2.12.4.dist-info
    ├── ...
Compress the source code and required external libraries into a ZIP file and upload it from the AWS Lambda web console.
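For reference, one way to create the ZIP is to archive the contents of the project directory (the archive name here is just an example):

$ cd /path/to/project
$ zip -r lambda_function.zip .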
Reference: Create Deployment Package (Python) - AWS Lambda
The results of crawling the following URL are saved to S3 in JSON format on a regular schedule, and the Lambda function reads that JSON file.
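The crawler itself is not shown in this post, but for illustration, here is a minimal sketch of such a crawler. The URL and CSS selectors below are placeholders, not the real ones, and it assumes BeautifulSoup is available; the only requirement is that each item ends up with the title / price / link keys that get_items_by_keyword expects, stored as items.json in the apple-refurbished bucket.

```python
# Rough sketch of a scheduled crawler (not the actual script used by the bot).
# PAGE_URL and the CSS selectors are placeholders -- adjust them to the real page.
import json

import boto3
import requests
from bs4 import BeautifulSoup  # assumes BeautifulSoup is installed

PAGE_URL = 'https://example.com/refurbished'  # placeholder URL


def crawl():
    html = requests.get(PAGE_URL).text
    soup = BeautifulSoup(html, 'html.parser')
    items = []
    for row in soup.select('.product'):  # placeholder selector
        items.append({
            'title': row.select_one('.title').get_text(strip=True),
            'price': row.select_one('.price').get_text(strip=True),
            'link': row.select_one('a')['href'],
        })
    return items


def save_to_s3(items):
    # Write the product list where the Lambda function expects to find it
    s3 = boto3.client('s3')
    s3.put_object(
        Bucket='apple-refurbished',
        Key='items.json',
        Body=json.dumps(items, ensure_ascii=False).encode('utf-8'),
    )


if __name__ == '__main__':
    save_to_s3(crawl())
```

Running something like this on a schedule (cron, or another scheduled Lambda) keeps items.json up to date for the bot.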
I don't think this idea alone will cut it, but if there are any good ideas I'd like to implement them and enter the LINE BOT AWARDS.
See you soon.