import db_helper 
import requests
import datetime
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from env import cookies
baseUrl="http://www.cwl.gov.cn"
# Scrape the detail page for one draw and insert its winning numbers into the database (skips duplicates)
def writeWinningNumers(wapItem):
    detailUrl = f"{baseUrl}{wapItem['detailsLink']}"
    print(f'#detailUrl:{detailUrl}')
    option = webdriver.ChromeOptions()
    s = Service(executable_path=r'D:\pf\Lib\site-packages\selenium\webdriver\chrome\chromedriver.exe')
    driver = webdriver.Chrome(service=s, options=option)
    driver.get(detailUrl)
    # Cookies can only be added after loading a page on the target domain; refresh so they take effect
    for cookie in cookies:
        driver.add_cookie(cookie)
    driver.refresh()
    soup = BeautifulSoup(driver.page_source, 'html.parser')
    qius = soup.select(".qiu-item.qiu-item-big")
    qiu7 = qius[:7]
    drawDate = wapItem['date'][:10]
    for i, qiu in enumerate(qiu7, start=1):
        print(f'{i}:{qiu.text}')
    numbers = [q.text.strip() for q in qiu7]
    sqlTemp = (
        "INSERT INTO cp.WinningNumbers (type,Period,DrawDate,Number1,Number2,Number3,Number4,Number5,Number6,Number7) "
        f"VALUES(1,'{wapItem['code']}','{drawDate}','" + "','".join(numbers) + "'); "
    )
    driver.quit()
    with db_helper.buildDbHelper() as db:
        # Check whether this period already exists before inserting
        records = db.executeQuerySql(f"select * from cp.WinningNumbers where Period='{wapItem['code']}'; ")
        if not records:
            db.executeCommit(sqlTemp)
        else:
            print(f"check data {wapItem['code']} exists")
# Fetch the latest draw data from the official site
def getWapLast():
    wapUrl="http://www.cwl.gov.cn/cwl_admin/front/cwlkj/search/kjxx/findDrawNotice?name=ssq&issueCount=30&issueStart=&issueEnd=&dayStart=&dayEnd="
    headers = {
        'User-Agent':'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.63 Safari/537.36'
    }
    data_res = requests.get(wapUrl, headers=headers).json()
    results = data_res['result']
    # Sort ascending by period code so older draws are written first
    results.sort(key=lambda x: x['code'])
    return results
# Fetch the most recent record from the database
def getDbLast():
    with db_helper.buildDbHelper() as db:
        sql = "select * from cp.WinningNumbers order by DrawDate desc LIMIT 1; "
        records = db.executeQuerySql(sql)
        if records and len(records) >= 1:
            return records[0]
        else:
            return None


# Compare the latest draw on the site with the latest one in the database;
# if the database is behind, write the newer draws to the database
def process():
    dbLast= getDbLast()
    print(dbLast)
    wapList=getWapLast()
    print(wapList)
    if wapList and len(wapList) >= 1 and dbLast:
        for wapItem in wapList:
            # 'date' starts with 'YYYY-MM-DD'; keep only that part
            wapDateStr = wapItem['date'][:10]
            wapDate = datetime.date.fromisoformat(wapDateStr)
            # dbLast[3] is the DrawDate column of the latest stored record
            if dbLast[3] < wapDate:
                writeWinningNumers(wapItem)



if __name__=='__main__':
    process()
