黎涛 / meta_base_code · Commits

Commit 1a051389 authored Sep 27, 2020 by litaolemo
update

parent 76645a6a
Showing 1 changed file with 19 additions and 3 deletions.

output/out_put_user_post_each_strategy.py  +19 -3
@@ -197,7 +197,7 @@ select t2.device_id from
     WHERE spam_pv.device_id IS NULL
     and dev.device_id is null
-    """.format(today_str=today_str, last_30_day_str=last_30_day_str)
+    """.format(today_str=today_str, last_30_day_str=today_str)
     print(huidu_device_id_sql)
     huidu_device_id_df = spark.sql(huidu_device_id_sql)
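The only change in this hunk is the second keyword passed to str.format: the {last_30_day_str} placeholder inside the huidu_device_id_sql template is now filled with today_str as well, so both date slots in the query receive the same day. A minimal sketch of that placeholder-binding pattern, with a made-up table name and date values rather than the script's real ones:

import datetime

# Hypothetical dates standing in for the script's real today_str / last_30_day_str.
base_day = datetime.date(2020, 9, 27)
today_str = base_day.strftime("%Y%m%d")
last_30_day_str = (base_day - datetime.timedelta(days=30)).strftime("%Y%m%d")

# demo_table and the column names are placeholders, not the real query.
huidu_device_id_sql = """
select t2.device_id from demo_table t2
where t2.partition_date >= '{last_30_day_str}' and t2.partition_date <= '{today_str}'
""".format(today_str=today_str, last_30_day_str=today_str)  # after the commit, both slots get today_str

print(huidu_device_id_sql)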
@@ -331,7 +331,10 @@ for res in sql_res:
         if transaction_type not in second_demands_id_count:
             second_demands_id_count[transaction_type] = {}
         if tag_id in second_demands_id_count[transaction_type]:
-            second_demands_id_count[transaction_type][tag_id][int(card_id)] = 1
+            try:
+                second_demands_id_count[transaction_type][tag_id][int(card_id)] += 1
+            except:
+                second_demands_id_count[transaction_type][tag_id][int(card_id)] = 1
         else:
             second_demands_id_count[transaction_type][tag_id] = {}
             second_demands_id_count[transaction_type][tag_id][int(card_id)] = 1
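The replaced assignment always reset the exposure count to 1; the new try/except makes second_demands_id_count[transaction_type][tag_id] behave as a per-card counter, with the except branch handling the first occurrence of a card_id. The same bookkeeping can be written without the bare except by nesting defaultdict and Counter; a sketch under the assumption that transaction_type, tag_id and card_id arrive row by row as in the loop above, not the script's actual code:

from collections import defaultdict, Counter

# counts[transaction_type][tag_id][card_id] -> number of times the card was seen
second_demands_id_count = defaultdict(lambda: defaultdict(Counter))

def record(transaction_type, tag_id, card_id):
    # Missing levels are created on demand, so no membership checks or try/except are needed.
    second_demands_id_count[transaction_type][tag_id][int(card_id)] += 1

record("search", "tag_a", "101")   # hypothetical values
record("search", "tag_a", "101")
print(second_demands_id_count["search"]["tag_a"][101])  # 2

The resulting structure still supports len(...) for the number of distinct cards and sum(....values()) for total exposures, which is what the later hunks read out.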
@@ -360,7 +363,10 @@ for res in sql_res:
         if transaction_type not in projects_demands_id_count:
             projects_demands_id_count[transaction_type] = {}
         if tag_id in projects_demands_id_count[transaction_type]:
-            projects_demands_id_count[transaction_type][tag_id][int(card_id)] = 1
+            try:
+                projects_demands_id_count[transaction_type][tag_id][int(card_id)] += 1
+            except:
+                projects_demands_id_count[transaction_type][tag_id][int(card_id)] = 1
         else:
             projects_demands_id_count[transaction_type][tag_id] = {}
             projects_demands_id_count[transaction_type][tag_id][int(card_id)] = 1
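This hunk mirrors the previous one for projects_demands_id_count. One note on the pattern: the bare except catches any exception, not only the KeyError raised the first time a card_id is seen; dict.get with a default expresses the same increment-or-initialise step explicitly. A short sketch of that alternative, with hypothetical seed values in place of the script's data:

# Assuming the same nested-dict shape as in the hunk above.
projects_demands_id_count = {"search": {"tag_a": {}}}   # hypothetical seed data
transaction_type, tag_id, card_id = "search", "tag_a", "101"

inner = projects_demands_id_count[transaction_type][tag_id]
inner[int(card_id)] = inner.get(int(card_id), 0) + 1    # increment, or initialise to 1 on first sight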
@@ -399,8 +405,13 @@ for tag_id in second_demands_tag_count:
     for transaction_type in second_demands_id_count:
         if second_demands_id_count[transaction_type].get(tag_id):
             temp_dict[transaction_type] = len(second_demands_id_count[transaction_type][tag_id])
+            try:
+                temp_dict[transaction_type + "_pv"] = sum(second_demands_id_count[transaction_type][tag_id].values())
+            except:
+                temp_dict[transaction_type + "_pv"] = 0
         else:
             temp_dict[transaction_type] = 0
+            temp_dict[transaction_type + "_pv"] = 0
     second_demands_csv_list.append(temp_dict)
     print(temp_dict)
 second_demands_data = pandas.DataFrame(second_demands_csv_list)
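With these additions every temp_dict row carries two figures per transaction type: the distinct-card count from len(...) and a <type>_pv total from sum(....values()). A minimal illustration of how the two aggregates relate and how the row list becomes a DataFrame, using made-up counts and a hypothetical "welfare" transaction type rather than the script's data:

import pandas

# Hypothetical per-card exposure counts for one (transaction_type, tag_id) pair.
per_card = {101: 3, 102: 1, 105: 2}

temp_dict = {"tag": "demo_tag"}                    # placeholder; the real rows carry more fields
temp_dict["welfare"] = len(per_card)               # 3 distinct cards
temp_dict["welfare_pv"] = sum(per_card.values())   # 6 exposures in total

second_demands_csv_list = [temp_dict]
second_demands_data = pandas.DataFrame(second_demands_csv_list)
print(second_demands_data)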
@@ -433,8 +444,13 @@ for tag_id in projects_demands_tag_count:
     for transaction_type in projects_demands_id_count:
         if projects_demands_id_count[transaction_type].get(tag_id):
             temp_dict[transaction_type] = len(projects_demands_id_count[transaction_type][tag_id])
+            try:
+                temp_dict[transaction_type + "_pv"] = sum(projects_demands_id_count[transaction_type][tag_id].values())
+            except:
+                temp_dict[transaction_type + "_pv"] = 0
         else:
             temp_dict[transaction_type] = 0
+            temp_dict[transaction_type + "_pv"] = 0
     projects_csv_list.append(temp_dict)
     print(temp_dict)
 projects_data = pandas.DataFrame(projects_csv_list)
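The extra temp_dict[transaction_type + "_pv"] = 0 in the else branch also matters once the rows reach pandas.DataFrame: dict rows that lack a key get NaN in that column, which turns it into floats, while the explicit 0 keeps the pv columns integral. A small illustration with made-up rows and a hypothetical "welfare" type:

import pandas

rows_missing_key = [{"welfare": 2, "welfare_pv": 5}, {"welfare": 0}]
rows_with_zero   = [{"welfare": 2, "welfare_pv": 5}, {"welfare": 0, "welfare_pv": 0}]

print(pandas.DataFrame(rows_missing_key))  # welfare_pv becomes NaN (float) in the second row
print(pandas.DataFrame(rows_with_zero))    # welfare_pv stays 0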