Column             Type            Min  Max
row_id             int64           0    48.4k
init_message       string lengths  1    342k
conversation_hash  string lengths  32   32
scores             dict            -    -
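Each record below carries a scores dict with one float per expertise label (beginner, intermediate, expert). The following sketch is only an illustration of how a single record of this shape might be handled: the field names and the example values are copied from row 45,180, while the loading approach and the top_expertise helper are assumptions, not part of the dataset or of any particular library.

```python
# Illustrative only: hold one record of the shape previewed below and
# reduce its scores dict to a single predicted expertise level.
# The field names (row_id, init_message, conversation_hash, scores) come
# from the column header above; the values are copied from row 45,180.
# top_expertise is a hypothetical helper, not part of the dataset tooling.

row = {
    "row_id": 45180,
    "init_message": "Docker composer para un servicio cms wordpress y mysql",
    "conversation_hash": "78d9dd0d07a710bf967d3fec982d20ef",
    "scores": {
        "intermediate": 0.36423444747924805,
        "beginner": 0.244264617562294,
        "expert": 0.39150092005729675,
    },
}


def top_expertise(scores: dict) -> str:
    """Return the label with the highest score, e.g. 'expert' for row 45,180."""
    return max(scores, key=scores.get)


if __name__ == "__main__":
    label = top_expertise(row["scores"])
    print(f"row {row['row_id']}: {label} ({row['scores'][label]:.3f})")
```

For row 45,180 this prints expert with score 0.392, matching the largest value in the scores dict shown a few lines down.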
45,180
Docker composer para un servicio cms wordpress y mysql
78d9dd0d07a710bf967d3fec982d20ef
{ "intermediate": 0.36423444747924805, "beginner": 0.244264617562294, "expert": 0.39150092005729675 }
45,181
Create a react tailwindcss boilerplate.
47c71d58fd3e4f23269b1c090bc4f8f2
{ "intermediate": 0.21819624304771423, "beginner": 0.6951704025268555, "expert": 0.08663331717252731 }
45,182
processing lerp source code
28e19bb7679913e129938dc5f60ccd9e
{ "intermediate": 0.08849375694990158, "beginner": 0.5666261315345764, "expert": 0.34488019347190857 }
45,183
As an experienced data engineer, The growth rate of DAU/MAU has dropped a lot. Analyze what might be the reason? Find out what factors may have contributed to the decline in DAU/MAU?
6da5719507e05f2b5981079d7a964473
{ "intermediate": 0.29066088795661926, "beginner": 0.21480929851531982, "expert": 0.4945298433303833 }
45,184
Write a meme in a creative way. Example: ▄︻デ══━一 for AWP
460f48557245c32fe4f4ba1b56e11561
{ "intermediate": 0.38403528928756714, "beginner": 0.2747439444065094, "expert": 0.3412207365036011 }
45,185
Explain in detail how to use LangChain with python
1e34efadced51e4f1fefd770ef8e1b65
{ "intermediate": 0.4614045321941376, "beginner": 0.08396666496992111, "expert": 0.4546287953853607 }
45,186
Create and formate new mo2 seperator catergories like the ones below, 【CORE MODS】 Unofficial Patches & Bug Fixes Optimization & Performance Tweaks Mesh & Texture FIxes Libaries & Addons - Like the format above - add 450 seporators for Patches, outputs. etc
ac2fa2ffb950921f119f4e6b2e0b772d
{ "intermediate": 0.3816569149494171, "beginner": 0.1903870701789856, "expert": 0.4279560446739197 }
45,187
Email user verification colyseus
26d41c15b42cf8c433376135c1f433a2
{ "intermediate": 0.3161717355251312, "beginner": 0.30384233593940735, "expert": 0.37998589873313904 }
45,188
hi there, im coding in android studio and somehow when i run the application in my phone it just says Hello Android! i believe that this hello android i connected to MainActivity.kt and not in my drawable layout MainActivity.xml Main Activity.kt code: package com.androidstudio.clearup import android.os.Bundle import androidx.activity.ComponentActivity import androidx.activity.compose.setContent import androidx.compose.foundation.layout.fillMaxSize import androidx.compose.material3.MaterialTheme import androidx.compose.material3.Surface import androidx.compose.material3.Text import androidx.compose.runtime.Composable import androidx.compose.ui.Modifier import androidx.compose.ui.tooling.preview.Preview import com.androidstudio.clearup.ui.theme.ClearUpTheme class MainActivity : ComponentActivity() { override fun onCreate(savedInstanceState: Bundle?) { super.onCreate(savedInstanceState) setContent { ClearUpTheme { // A surface container using the 'background' color from the theme Surface( modifier = Modifier.fillMaxSize(), color = MaterialTheme.colorScheme.background ) { Greeting("Android") } } } } } @Composable fun Greeting(name: String, modifier: Modifier = Modifier) { Text( text = "Hello $name!", modifier = modifier ) } @Preview(showBackground = true) @Composable fun GreetingPreview() { ClearUpTheme { Greeting("Android") } } main_activity.xml code <?xml version="1.0" encoding="utf-8"?> <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android" xmlns:tools="http://schemas.android.com/tools" android:orientation="vertical" android:layout_width="match_parent" android:layout_height="match_parent" tools:context=".MainActivity"> <View android:layout_width="match_parent" android:layout_height="wrap_content" android:layout_marginTop="50dp" android:layout_marginBottom="350dp" android:layout_marginLeft="20dp" android:layout_marginRight="20dp" android:background="@drawable/image_outline"> </View> <ImageButton android:layout_width="match_parent" android:layout_height="wrap_content" android:background="@drawable/button_background" /> </LinearLayout>
d04daaf1167f83425c8d125d2387cecf
{ "intermediate": 0.44913047552108765, "beginner": 0.38025587797164917, "expert": 0.1706136167049408 }
45,189
The edit distance between two strings S1 and S2 is the minimum number of operations to convert one string to the other string. We assume that three types of operations can be used: Insert (a character), Delete (a character), and Replace (a character by another character). For example, the edit distance between dof and dog is 1 (one Replace), between cat and act is 2 (one Delete and one Insert or two Replace), between cat and dog is 3 (3 Replace). Design a dynamic programming algorithm to compute the edit distance between two strings by following the steps below: a. Write down the principle of optimality for the minimum edit distance problem, and prove that the problem satisfies the principle of optimality. b. Show the recurrence equation for computing the edit distance. (Hint: Let d[i, j] be the edit distance between the substring of the first i characters of S1 and the substring of the first j characters of S2. Then consider the prefixes of the two strings in a way similar to the analysis for the LCS problem.)
b89624f16479aa3ae19937dd642b9199
{ "intermediate": 0.09612835198640823, "beginner": 0.05722074583172798, "expert": 0.8466508984565735 }
45,190
Hey there, great to meet you. I’m Pi, your personal AI. My goal is to be useful, friendly and fun. Ask me for advice, for answers, or let’s talk about whatever’s on your mind. How's your day going? Ok Cool. What did you want to talk about? Ask me for advice, for answers, or let’s talk about whatever’s on your mind. You are given a code which gets as input two numbers n1 and n2 and a character op. Note: we will learn in next lessons how to get input from the user, currently just don't touch the three first lines. The possible values for op are '+', '-', '/' and '*' Your task is to set the variable result based on the conditions: if op is '+', set result with n1 + n2. if op is '-', set result with n1 - n2. if op is '/', set result with n1 / n2. if op is '*', set result with n1 * n2. Here’s a code snippet that implements the task: Python Copy n1 = float(input("Enter first number: ")) n2 = float(input("Enter second number: ")) op = input("Enter operator (+, -, /, *): ") if op == '+': result = n1 + n2 elif op == '-': result = n1 - n2 elif op == '/': result = n1 / n2 elif op == '*': result = n1 * n2 This code receives user inputs for n1, n2, and op via input prompts. It then calculates result based on the chosen operator. You will need to define the method to accept inputs if you’re using another language or library, as this code snippet assumes a Python environment. print (f"The input is: {rnd}") To print the input number rnd using an f-string in Python, you can use the following line of code: Python Copy print(f"The input is: {rnd}") This line will print the value of the variable rnd inside the string "The input is: ". Remember to replace rnd with the actual variable name that stores your input if it is named differently. Whats the use of the fstring in python and whats its use F-strings, also known as formatted string literals, are a feature in Python 3.6 and above that make it easier to embed expressions inside string literals, using curly braces {} as placeholders. The part before the string is denoted by the letter "f" or "F", which stands for "formatted string literal". Here's an example of an f-string: Python Copy name = "John" age = 28 print(f"{name} is {age} years old.") When this code is executed, the output will be "John is 28 years old." The main use of f-strings is to simplify string formatting. They provide a concise and readable way to include variable values, calculations, or other Python expr... Show more Whats the use of fstrings in python and how it is implemented.Explain as if i am a beginner F-strings, also known as formatted string literals, are a feature in Python 3.6 and above. They make it easy to embed expressions inside string literals, using curly braces {} as placeholders. To create an f-string, you simply prefix a string with the letter "f" or "F", which stands for "formatted string literal". Inside the string, you can include expressions enclosed in curly braces. Python will evaluate these expressions and replace the placeholders with the corresponding values. Here's an example: Python Copy name = "John" age = 28 print(f"{name} is {age} years old.") In this case, Python will evaluate the expressions name and `... Show more What is the default data type of the input received from the input() function in Python? The input() function in Python always returns a string as the default data type. Whatever input a user provides through input(), Python considers it as a string by default. 
If you need to perform operations that require other data types such as integers, floats, or boolean values, you must convert the input into the appropriate data type using built-in conversion functions like int(), float(), or bool(). Here's an example of converting user input into an integer: Python Copy # Ask user to input an integer user_input = input("Please enter an integer: ") # Convert the user input to integer data type integer_input = int(user_input) #... Show more If a user inputs the number 42, which of the following representations is correct for the variable var storing this input? 42 None of the options "'42'" "42" When a user inputs the number 42 using the input() function, the correct representation of the variable var storing this input is "42". This is because Python's input() function always returns a string, even if the user enters a number. So, when the user enters 42, Python interprets it as the string "42". In Python, strings are denoted with either single quotes ('42') or double quotes ("42"). However, the input() function doesn't include the quotes when capturing the user input. Here's an example that illustrates this: Python Copy # Ask user to input a number user_input = input("Please enter a number: ") # Print the type of the input print("Type of input:", type(user_input)) # Print the input as it was captured print("Captured input:", user_input) When a user inputs 42 in this case, the output will be: Text Copy Type of input: <class 'str'> Captured input: 42 This shows that Python treats the input as a string and stores it as "42" in the variable user_input. Write a program that get input from the user (their name), and then outputs Hello, followed by the user's inputted name. For example, if the user inputs Bob, the expected output is Hello, Bob. You will need to: Use input() to get input from the user. Store the input in a variable. Print Hello, and the stored variable in the end. Use the f"" string to print the variable inside the print
b9d86c222a6d253a1faa9859c25ee186
{ "intermediate": 0.3915971517562866, "beginner": 0.41622012853622437, "expert": 0.19218270480632782 }
45,191
The edit distance between two strings S1 and S2 is the minimum number of operations to convert one string to the other string. We assume that three types of operations can be used: Insert (a character), Delete (a character), and Replace (a character by another character). For example, the edit distance between dof and dog is 1 (one Replace), between cat and act is 2 (one Delete and one Insert or two Replace), between cat and dog is 3 (3 Replace). Design a dynamic programming algorithm to compute the edit distance between two strings by following the steps below: Use Edit-Distance() to create the table d (d[i, j] is defined above) for S1 = cats and S2 = fast. The entry at d[4, 4] should show the correct edit distance between the two words.
56338af423c1517560da0d7114e93e36
{ "intermediate": 0.18638114631175995, "beginner": 0.115251824259758, "expert": 0.6983669996261597 }
45,192
Dockerfile中COPY --from=build ,--from=build是什么意思?
488b930953bda0c50d16e7174868956b
{ "intermediate": 0.34530025720596313, "beginner": 0.3658272624015808, "expert": 0.28887251019477844 }
45,193
check this code: impl Bed12 { pub fn new(line: &str) -> Result<Bed12, &'static str> { let fields: Vec<&str> = line.split('\t').collect(); if fields.len() < MIN_BED_FIELDS { return Err("Bed line has less than 12 fields and cannot be parsed into a Record"); } let chrom = fields[0].to_string(); let tx_start = fields[1] .parse::<u64>() .map_err(|_| "Cannot parse tx_start")?; let tx_end = fields[2] .parse::<u64>() .map_err(|_| "Cannot parse tx_end")?; let id = Arc::from(fields[3]); let strand = fields[5] .parse::<char>() .map_err(|_| "Cannot parse strand")?; let cds_start = fields[6] .parse::<u64>() .map_err(|_| "Cannot parse cds_start")?; let cds_end = fields[7] .parse::<u64>() .map_err(|_| "Cannot parse cds_end")?; let exon_start = fields[11] .split(',') .filter(|s| !s.is_empty()) .map(|x| x.parse::<u64>()) .collect::<Result<Vec<u64>, _>>(); let exon_end = fields[10] .split(',') .filter(|s| !s.is_empty()) .map(|x| x.parse::<u64>()) .collect::<Result<Vec<u64>, _>>(); let exon_start = exon_start.map_err(|_| "Cannot parse exon_start")?; let exon_end = exon_end.map_err(|_| "Cannot parse exon_end")?; if exon_start.len() != exon_end.len() { return Err("Exon start and end vectors have different lengths"); } let exon_starts: Vec<u64> = exon_start.iter().map(|&s| s + tx_start).collect(); let exon_ends: Vec<u64> = exon_end .iter() .enumerate() .map(|(i, &s)| s + exon_starts[i]) .collect(); let exons = exon_starts .iter() .zip(exon_ends.iter()) .map(|(&s, &e)| (s, e)) .collect::<HashSet<(u64, u64)>>(); let introns = exon_starts[1..] .iter() .map(|&s| s - 1) .zip(exon_ends[..exon_ends.len() - 1].iter().map(|&e| e + 1)) .map(|(s, e)| (e, s)) .filter(|&(e, s)| e < s) .collect::<HashSet<(u64, u64)>>(); let exons = match strand { '+' => exons, '-' => exons .iter() .map(|(s, e)| (SCALE - *e, SCALE - *s)) .collect(), _ => return Err("Strand is not + or -"), }; let introns = match strand { '+' => introns, '-' => introns .iter() .map(|(s, e)| (SCALE - *e, SCALE - *s)) .collect(), _ => return Err("Strand is not + or -"), }; let utr = match strand { '+' => Utr::new(tx_start, cds_start, cds_end, tx_end), '-' => Utr::new( SCALE - tx_end, SCALE - cds_end, SCALE - cds_start, SCALE - tx_start, ), _ => return Err("Strand is not + or -"), }; let tx_start_stranded = match strand { '+' => tx_start, '-' => SCALE - tx_end, _ => return Err("Strand is not + or -"), }; let tx_end_stranded = match strand { '+' => tx_end, '-' => SCALE - tx_start, _ => return Err("Strand is not + or -"), }; Ok(Bed12 { chrom: chrom.to_string(), info: (tx_start_stranded, tx_end_stranded, exons, introns, utr, id), line: line.to_string(), }) } } you are an expert Rust programmer. Any idea on how to make it more elegant, efficient or faster?
ad2ed2f4114c982d6dee7c9ae796f31c
{ "intermediate": 0.39357686042785645, "beginner": 0.4645828306674957, "expert": 0.14184030890464783 }
45,194
import java.util.Scanner; class person{ String name; int age; float salary; Scanner sc=new Scanner(System.in); public void get_value(){ System.out.println("Enter the name:"); this.name=sc.nextLine(); System.out.println("Enter the age:"); this.age=sc.nextInt(); System.out.println("Enter the salary:"); this.salary=sc.nextFloat(); } public void print_value(){ System.out.println("Detail of person"); System.out.println("Name"+name); System.out.println("age"+age); System.out.println("salary"+salary); } } class main{ public static void main(String[] args){ person p1=new person(); p1.get_value(); p1.print_value(); } }
582e468b2f6abbdac31e4ae37709dcdf
{ "intermediate": 0.33761411905288696, "beginner": 0.41945695877075195, "expert": 0.2429289072751999 }
45,195
Write a program that get input from the user (their name), and then outputs Hello, followed by the user's inputted name. For example, if the user inputs Bob, the expected output is Hello, Bob. You will need to: Use input() to get input from the user. Store the input in a variable. Print Hello, and the stored variable in the end. Use the f"" string to print the variable inside the print
98cc2a650ac7edfd1f2dc2ad119b51b6
{ "intermediate": 0.2808752655982971, "beginner": 0.5110728740692139, "expert": 0.20805183053016663 }
45,196
how to create gtest with a main_test.cpp and another file contains TEST functions?
700087aad2c8e1e705ac364ffd9e148f
{ "intermediate": 0.46281611919403076, "beginner": 0.2581677734851837, "expert": 0.2790161073207855 }
45,197
Power query Rest API with pagination
6dbc0f69b87c69e4008290f0f614a2ee
{ "intermediate": 0.6338200569152832, "beginner": 0.2138117253780365, "expert": 0.15236817300319672 }
45,198
I am trying to auto populate group members in a variable based on other variable choices. EX: We have 2 variable V1 and V2 V1 has 3 choices Choices: TYPE1, TYPE2, TYPE3 When V1 is selected as = TYPE1 the Variable V2 should show list group members who are part of "TYPE1Group" When V1 is selected as = TYPE2 the Variable V2 should show list group members who are part of "TYPE2Group" When V1 is selected as = TYPE3 the Variable V2 should show list group members who are part of "TYPE3Group" Servicenow
1b8249aaf7613b210d81a265fc00fc15
{ "intermediate": 0.32066720724105835, "beginner": 0.3834385871887207, "expert": 0.29589420557022095 }
45,199
Что не так с этим заголовочным файлом?: #ifndef PROVINCE_H #define PROVINCE_H #include "/home/danya/raylib/src/raylib.h" #include <vector> #include <fstream> #include <sstream> extern Camera2D camera; extern Vector2 mousePosition = GetMousePosition(); extern Vector2 pointPosition = GetScreenToWorld2D(mousePosition, camera); class Province { private: std::vector<Vector2> points; int id; public: void drawProvince(Color fillColor, bool drawStroke, Color strokeColor, float strokeThickness); }; #endif
e387b95253f1dc0709cae27d21f58849
{ "intermediate": 0.3252348303794861, "beginner": 0.5678223967552185, "expert": 0.1069427952170372 }
45,200
Use Floyd's algorithm to find all pair shortest paths in the following graph: The graph has 7 nodes namely v1, v2, v3, v4, v5, v6, v7. v1 has outgoing edge to v2 at a distance of 4 and outgoing edge to v6 at a distance of 10. v2 has outgoing edge to v1 at a distance of 3 and outgoing edge to v4 at a distance of 18. v3 has outgoing edge to v2 at a distance of 6. v4 has outgoing edge to v2 at a distance of 5, outgoing edge to v3 at a distance of 15, outgoing edge to v5 at a distance of 2, outgoing edge to v6 at a distance of 19 and outgoing edge to v7 at a distance of 5. v5 has outgoing edge to v3 at a distance of 12 and outgoing edge to v4 at a distance of 1. v6 has outgoing edge to v7 at a distance of 10. v7 has outgoing edge to v4 at a distance of 8. Construct the matrix D, which contains the lengths of the shortest paths, and the matrix P, which contains the highest indices of the intermediate vertices on the shortest paths. Show the actions step by step. You need to show D0 to D7 and P0 to P7 (i.e. matrix P updated along with D step by step).
1c68f8749e84e6074ce1ddc24004b041
{ "intermediate": 0.16926135122776031, "beginner": 0.14018164575099945, "expert": 0.6905570030212402 }
45,201
help me make a fivem server zombie dayz theme server lua
d22a062818f8578e07e4a00601a4aa7c
{ "intermediate": 0.3899039924144745, "beginner": 0.23491550981998444, "expert": 0.3751804530620575 }
45,202
in React-effector library is there a method for store similar to .some?
fa489d626279e9f025975b4341180d3b
{ "intermediate": 0.7739425897598267, "beginner": 0.1231154277920723, "expert": 0.10294196009635925 }
45,203
write python program for importing my library from apple music using credentials to store into locally the links of songs from library
0a81f017a37941cd7a1883ac2666c55c
{ "intermediate": 0.6020870208740234, "beginner": 0.13893945515155792, "expert": 0.25897353887557983 }
45,204
HY , I NEED AN VBA CODE TO SHOW IN CELL A1 COUNT OF CELLS ON SELECTED ROW WITH AN MARK X INSIDE FOR Sheet2(Licente)
a3801e60c6908a333297423573d2c933
{ "intermediate": 0.5260443687438965, "beginner": 0.1872744858264923, "expert": 0.2866811752319336 }
45,205
in servicnenow, I need to write a ACL (Access control list) for giving read access on records to current logged-in-user Conditions: 1. if He is manager of any group & Having ITIL role "AND" 2. He can see record only if resolved by is a group member where he is the manager. For Example: If I am current logged in user and I am the manager of X group & having ITIL role, So I can see which record only they resolved by X group's member.
ee1f4288a6b28a9cd3b3e96e7f644a74
{ "intermediate": 0.4458940327167511, "beginner": 0.23263037204742432, "expert": 0.321475625038147 }
45,206
diffrence between #if and #ifdef
635f366e7393325b6542e10034d22cc5
{ "intermediate": 0.2996605932712555, "beginner": 0.457660973072052, "expert": 0.2426784634590149 }
45,207
if i dont know the target name how to access this via variable
5f2bac6877b78e92bf44e04cc5b23659
{ "intermediate": 0.3528299331665039, "beginner": 0.3797396421432495, "expert": 0.26743045449256897 }
45,208
if i dont know the target name how to access this via variable in cmakelists for setting target_compile_definitions
1abb627281ea00fc1bb6cf0bf278d90c
{ "intermediate": 0.4006616175174713, "beginner": 0.31176185607910156, "expert": 0.28757649660110474 }
45,209
Show example how to use threading in python
9898b5effb5d6f5fb0e41e6659e1aecc
{ "intermediate": 0.4534110426902771, "beginner": 0.17324887216091156, "expert": 0.3733401298522949 }
45,210
i have this react function. const isChecked = (id: number): boolean => { // isOptionCheckedEv({filter: selectedFilter, id}); if (selectedFilters[selectedFilter]) { console.log(selectedFilters[selectedFilter].some(option => option.id === id)); selectedFilters[selectedFilter].some(option => option.id === id) } }; isChecked should return a boolean value. I want it to return true if my .some method would return true and false otherwise. How can I do it? Keep in mind that I need it all to be Typescript.
30e55df35a25039ee7d86f342f0adca2
{ "intermediate": 0.387687087059021, "beginner": 0.4828522503376007, "expert": 0.1294606477022171 }
45,211
can write django simple async. from zero to create async view?
81c3800170be608f97ab73c69d44fe25
{ "intermediate": 0.6460711359977722, "beginner": 0.19881705939769745, "expert": 0.15511175990104675 }
45,212
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator # Define sequence length for the time series data sequence_length = 10 # Adjust as needed based on the length of sequences used during training # Example unseen date data in the format (Year, Month, Day) unseen_dates = np.array([[2023, 10, 11], [2023, 12, 12], [2023, 12, 13], [2023, 11, 14], [2023, 12, 15], [2023, 12, 16]]) # Create TimeseriesGenerator for unseen date data unseen_generator = TimeseriesGenerator(data=unseen_dates, targets=unseen_dates, # Use the date data itself as targets length=sequence_length, batch_size=1) # Make predictions on the unseen date data predictions = model.predict(unseen_generator) # Print the predictions print(predictions)
4b6e39911370d810f79f970341bffc6e
{ "intermediate": 0.4302140176296234, "beginner": 0.30385956168174744, "expert": 0.26592645049095154 }
45,213
in excel, how we can apply " Kondratieff Wave " for my data , my data is a1: a100, please make example.
b6fd1e9b86b69e33b3b750ce00f529b1
{ "intermediate": 0.38763463497161865, "beginner": 0.15839911997318268, "expert": 0.45396625995635986 }
45,214
This is the SQL query: SELECT e.first_name, e.last_name, e.job_id, d.department_name FROM employees e JOIN departments d ON (e.department_id = d.department_id) JOIN locations l ON (d.location_id = l.location_id) WHERE LOWER(l.city) = 'Toronto';
f1e2595c7bb30c0a0666d0a989db506e
{ "intermediate": 0.43617552518844604, "beginner": 0.2921947240829468, "expert": 0.2716297507286072 }
45,215
import numpy as np from pandas import * from sklearn.ensemble import RandomForestClassifier from sklearn.model_selection import train_test_split from sklearn import metrics from scipy.io import arff def read_arff(f): data, meta = arff.loadarff(f) return DataFrame(data) def kfold(clr,X,y,folds=10): auc_sum=0 kf = train_test_split(y, folds) for train_index, test_index in kf: X_train, X_test = X[train_index], X[test_index] y_train, y_test = y[train_index], y[test_index] clr.fit(X_train, y_train) pred_test = clr.predict(X_test) print metrics.auc_score(y_test,pred_test) auc_sum+=metrics.auc_score(y_test,pred_test) print 'AUC: ', auc_sum/folds print "----------------------------" #read the dataset X=read_arff('phpMawTba.arff') y=X['Defective'] #changes N, and Y to 0, and 1 respectively s = np.unique(y) mapping = Series([x[0] for x in enumerate(s)], index = s) y=y.map(mapping) del X['Defective'] #initialize random forests (by defualt it is set to 10 trees) rf=RandomForestClassifier() #run algorithm kfold(rf,np.array(X),y)
b939f228dab28792d98b3ab3f9b9b4cf
{ "intermediate": 0.37929025292396545, "beginner": 0.21187113225460052, "expert": 0.4088385999202728 }
45,216
def read_arff(f): data, meta = arff.loadarff(f) return pd.DataFrame(data) def kfold(clr, X, y, folds=10): auc_sum = 0 kf = KFold(n_splits=folds) # Use KFold for generating indices for train_index, test_index in kf.split(X): # Use kf.split(X) to get indices X_train, X_test = X[train_index], X[test_index] y_train, y_test = y[train_index], y[test_index] clr.fit(X_train, y_train) pred_test_proba = clr.predict_proba(X_test)[:, 1] # Use predict_proba for AUC computation auc_score = roc_auc_score(y_test, pred_test_proba) # Use roc_auc_score for AUC print("AUC score:", auc_score) auc_sum += auc_score print("Average AUC:", auc_sum / folds) print("----------------------------") # Read the dataset X = read_arff("phpMawTba.arff") y = X["education"] # Change N, and Y to 0, and 1 respectively s = np.unique(y) mapping = pd.Series({v: k for k, v in enumerate(s)}) y = y.map(mapping).astype(int) # Ensure y is of integer type del X["education"] # Initialize random forests (by default, it is set to 100 trees as of newer sklearn versions) rf = RandomForestClassifier() # Convert X to numpy array as scikit-learn models expect numpy arrays X_np = np.array(X) # Run algorithm kfold(rf, X_np, y)
ec5c16cf4d7aedd5e385d09cf69937d8
{ "intermediate": 0.31932133436203003, "beginner": 0.16803798079490662, "expert": 0.512640655040741 }
45,217
Understand the Foundations of Decision Trees
ec96f84cb69dff801ad0eeb99e9f0089
{ "intermediate": 0.22950994968414307, "beginner": 0.25696179270744324, "expert": 0.5135282874107361 }
45,218
In servicenow, I have a requirement to auto-close the resolved cases after 5 business days. Currently, we are using a flow to auto-close the resolved case after 5 calendar days. I have added a condition to check if the resolved date is relative on/before 5 days ago. Now I have to add a schedule to this flow and I am out of ideas. Appreciate any suggestion and help.
a409e551aefe7c9dba0bd04b7349ae42
{ "intermediate": 0.4629896283149719, "beginner": 0.28737038373947144, "expert": 0.24963998794555664 }
45,219
# Tokenize and pad the text data tokenizer = Tokenizer() tokenizer.fit_on_texts(df['Event']) sequences = tokenizer.texts_to_sequences(df['Event']) max_length = max([len(seq) for seq in sequences]) sequences_padded = pad_sequences(sequences, maxlen=max_length, padding='post') # Encode the dates encoder = LabelEncoder() df['Date'] = encoder.fit_transform(df['Date']) # Assuming 'Date' is the name of your column with date information df['Date'] = pd.to_datetime(df['Date']) # Assuming `df['Date']` is in datetime format df['Year'] = df['Date'].dt.year df['Month'] = df['Date'].dt.month df['Day'] = df['Date'].dt.day # Combine these into a NumPy array or DataFrame X_dates = df[['Year', 'Month', 'Day']].values from sklearn.preprocessing import OneHotEncoder encoder = OneHotEncoder(sparse=False) y_encoded = encoder.fit_transform(df[['Event']]) from sklearn.model_selection import train_test_split # Split your data accordingly X_text_train, X_text_test, X_date_train, X_date_test, y_train, y_test = train_test_split( sequences_padded, X_dates, y_encoded, test_size=0.2, random_state=42) from tensorflow.keras.layers import Input, Embedding, LSTM, Dense, Concatenate from tensorflow.keras.models import Model # Assuming vocab_size, max_length, and num_classes are correctly defined vocab_size = len(tokenizer.word_index) + 1 # Add 1 for padding token max_length = max([len(x) for x in sequences]) # Maximum sequence length num_classes = y_encoded.shape[1] # Number of unique classes # Text Input Model text_input = Input(shape=(max_length,), dtype='int32', name='text_input') text_embedding = Embedding(input_dim=vocab_size, output_dim=50, name='text_embedding')(text_input) text_lstm = LSTM(50, name='text_lstm')(text_embedding) # Date Input Model date_input = Input(shape=(3,), name='date_input') # Assuming [Year, Month, Day] format date_dense = Dense(50, activation='relu', name='date_dense')(date_input) # Concatenate the outputs of the two models concatenated = Concatenate()([text_lstm, date_dense]) output_layer = Dense(num_classes, activation='softmax')(concatenated) # Combined Model combined_model = Model(inputs=[text_input, date_input], outputs=output_layer) ### Step 2: Compile the Model combined_model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy']) import numpy as np # Example `y_train` array # y_train = np.array([...]) # Calculate the occurrences of each class class_weights = {} labels, counts = np.unique(y_train, return_counts=True) # Calculate the weight for each class for label, count in zip(labels, counts): class_weights[label] = len(y_train) / (len(labels) * count) print(class_weights) from tensorflow.keras.layers import Input, Embedding, LSTM, Dense, Concatenate from tensorflow.keras.models import Model # Assuming vocab_size, max_length, and num_classes are correctly defined vocab_size = len(tokenizer.word_index) + 1 # Add 1 for padding token max_length = max([len(x) for x in sequences]) # Maximum sequence length num_classes = y_encoded.shape[1] # Number of unique classes # Date Input Model date_input = Input(shape=(3,), name='date_input') # Assuming [Year, Month, Day] format date_dense = Dense(50, activation='relu', name='date_dense')(date_input) # Text Output Model text_output = Input(shape=(max_length,), dtype='int32', name='text_output') text_embedding = Embedding(input_dim=vocab_size, output_dim=50, name='text_embedding')(text_output) text_lstm = LSTM(50, name='text_lstm')(text_embedding) # Concatenate the outputs of the two models concatenated = 
Concatenate()([date_dense, text_lstm]) output_layer = Dense(num_classes, activation='softmax')(concatenated) # Combined Model combined_model = Model(inputs=[date_input, text_output], outputs=output_layer) ### Step 2: Compile the Model combined_model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy']) ### Step 3: Train the Model # Ensure X_date_train, X_text_train, and y_train are correctly defined and prepared history = combined_model.fit( [X_date_train, X_text_train], y_train, epochs=200, # Adjust as necessary batch_size=32, # Adjust as necessary validation_split=0.2, class_weight=class_weights# Adjust as necessary ) convert code into time series analysis
b524a2c3e5992da763697941eeff0aa6
{ "intermediate": 0.2323099821805954, "beginner": 0.5689046382904053, "expert": 0.19878537952899933 }
45,220
I want you to remove everything before the # on each line including the #
65d10422026289537fd8d09f0c1e32c0
{ "intermediate": 0.2606082260608673, "beginner": 0.3717309534549713, "expert": 0.36766085028648376 }
45,221
use react js with json-server and Material-ui
70d25a31863047010b991c7fcd4f63b6
{ "intermediate": 0.6000931859016418, "beginner": 0.2625204920768738, "expert": 0.13738632202148438 }
45,222
in " Kondratieff Wave " have P.R.D.E. in excel, there are other formula can find below P.R.D.E. in my data ? P: prosperity R: recession D: depression E: improvement
71372164aa664835518f42c8716a57aa
{ "intermediate": 0.2965812385082245, "beginner": 0.3524410128593445, "expert": 0.3509777784347534 }
45,223
как здесь корректно сделать, что если в первом catch вылетело исключение Long getOperationId(ResponseResult3Ds responseResult3Ds) { String operationAsStr = responseResult3Ds.getExtId(); LOGGER.info("[getOperationId] operationAsStr = {}", operationAsStr); try { return Long.parseLong(operationAsStr); } catch (Exception e) { // Если это был возврат, то номер операции будет с постфиксом типа: // 1425R1 // В зависимости от количества возвратов ExtId в ответе будет принимать вид ExtIdR1, ExtIdR2, ExtIdR3 и т.д. int postfixPos = operationAsStr.indexOf("R"); operationAsStr = operationAsStr.substring(0, postfixPos); LOGGER.info("[getOperationId] operationAsStr (without postfix) = {}", operationAsStr); } try { return Long.parseLong(operationAsStr); } catch (Exception e) { operationAsStr = operationAsStr.substring(0, operationAsStr.length() - 3); return Long.parseLong(operationAsStr); } }
67401c8b92677adff6a740ce2958f522
{ "intermediate": 0.3318106234073639, "beginner": 0.40133824944496155, "expert": 0.26685115694999695 }
45,225
Hi
5bca00f436767788979f3c05799282de
{ "intermediate": 0.33010533452033997, "beginner": 0.26984941959381104, "expert": 0.400045245885849 }
45,226
hy, i need an vba code to do this : on Sheet2(Licente) when i select an row to check for an mark x inside cells from selected row and if value from c on selected row is DA , colour all cells with an x mark inside from that row and first cell from that row in green , if value from c cell for selected row is NU then colour is red , also count number of cells with an mark x inside and display the number of cells on a1 cell from Sheet2(Licente),also if i delete or put an mark x on all cells from selected row , update automaticaly colour , count and save the file, also search value from first cell onSheet2(Licente) for selected row in Sheet3(System) and if more value are finded on an single colums ,than cumulate that value to 1 and count finded values on Sheet3(System) with only one results for colums , than compare number rezulted from cumulated colums from Sheet3(System) with number from A1 cells from Sheet2(Licente) and if difference is found display msg box "difference was found "
dbaec88c4250a7b6651b8025168a1efb
{ "intermediate": 0.4639831483364105, "beginner": 0.186045840382576, "expert": 0.3499709963798523 }
45,227
const createdDate = !createdDate ? 'N/A' : new Date(createdDate * 1000).toLocaleDateString(language, { month: 'short', day: 'numeric', year: 'numeric', hour: '2-digit', minute: '2-digit' }) Ошибка при (createdDate * 1000) - The left-hand side of an arithmetic operation must be of type 'any', 'number', 'bigint' or an enum type.ts что можно сделать?
98dfc613bc474e14a4eae64e5f2d003a
{ "intermediate": 0.2804376184940338, "beginner": 0.49104663729667664, "expert": 0.22851572930812836 }
45,228
Dim prevSelectedRow As Integer Private Sub Worksheet_SelectionChange(ByVal Target As Range) Dim c As Range Dim ws As Worksheet Set ws = Me ' Check if any previous row selected and turn its 'X' cells white If prevSelectedRow > 0 Then For Each c In ws.Rows(prevSelectedRow).Cells If UCase(c.value) = "X" Then c.Interior.Color = RGB(255, 255, 255) ' White End If Next c End If ' Change the current 'X' cells to yellow and update the prevSelectedRow If Target.Cells.Count = ws.Columns.Count Then For Each c In Target.Rows.Cells If UCase(c.value) = "X" Then c.Interior.Color = RGB(255, 255, 0) ' Yellow End If Next c prevSelectedRow = Target.Row Else prevSelectedRow = 0 End If End Sub
8355db3f67d3a7cf255d2516b11a0405
{ "intermediate": 0.5072509050369263, "beginner": 0.2866229712963104, "expert": 0.2061261534690857 }
45,229
How to launch a browser in selenium? give me short answer
70afc45bf4bdbf645f5e20c73de969ef
{ "intermediate": 0.21835842728614807, "beginner": 0.30346599221229553, "expert": 0.4781756103038788 }
45,230
how to write unit tests for static functions
26f3006488ee5d3193e92d7ed9ee4cad
{ "intermediate": 0.2798228859901428, "beginner": 0.5801144242286682, "expert": 0.14006273448467255 }
45,231
why is second thread not working? def add(thread): path = r"C:\Users\Plathera\Desktop\4224.json" index = 0 while True: try: print(f"\nthread: {thread}") index+=1 with open (path, encoding='utf-8') as file: data = json.load(file) data.append(index) with open (path, 'w', encoding='utf-8') as file: json.dump(data, file) time.sleep(0.1) print(data) except: Print('ERROR', 'red') # print(Load_Json(path)) # Create threads for each function t1 = threading.Thread(target=add(1)) t2 = threading.Thread(target=add(2))
9983cb5fcc3dc4edc3cdb755bbb44ab8
{ "intermediate": 0.30187252163887024, "beginner": 0.5625661611557007, "expert": 0.13556130230426788 }
45,232
convert three dimentional array to two dimentional array in c
153527b48dbc7fdc69a240770bb6c63d
{ "intermediate": 0.37864693999290466, "beginner": 0.209182009100914, "expert": 0.41217100620269775 }
45,233
Dim prevSelectedRow As Integer Private Sub Worksheet_SelectionChange(ByVal Target As Range) Dim c As Range Dim ws As Worksheet, licenteSheet As Worksheet Set ws = Me Set licenteSheet = ThisWorkbook.Sheets("Licente") Dim countX As Long ' Variable to count cells containing "X" countX = 0 ' Initialize counter to zero for each new selection ' Assuming column C in "Licente" sheet holds "DA" or "NU" for the corresponding rows in this sheet Dim decision As String If Target.row <= licenteSheet.Cells(Rows.Count, "C").End(xlUp).row Then decision = licenteSheet.Cells(Target.row, "C").value Else decision = "" End If ' Handle cell coloring based on 'DA' or 'NU' Dim colorForXCells As Long Dim colorForFirstCell As Long If decision = "DA" Then colorForXCells = RGB(0, 255, 0) ' Green colorForFirstCell = RGB(0, 255, 0) ElseIf decision = "NU" Then colorForXCells = RGB(255, 0, 0) ' Red colorForFirstCell = RGB(255, 0, 0) Else colorForXCells = RGB(255, 255, 255) ' White, or another default color colorForFirstCell = RGB(255, 255, 255) End If If Target.Cells.Count = ws.Columns.Count Then For Each c In Target.Rows.Cells If UCase(c.value) = "X" Then c.Interior.Color = colorForXCells countX = countX + 1 ' Increment the counter for each "X" found Else c.Interior.Color = RGB(255, 255, 255) ' Reset non-X cells to white End If Next c ' Color the first cell in the selected row according to DA or NU decision Target.Cells(1, 1).Interior.Color = colorForFirstCell prevSelectedRow = Target.row End If ' Update cell A1 with the count of 'X' cells found in the selected row ws.Range("A1").value = countX End Sub for this code i want that the colour of c cell on the selected row to change his colour acordig to value DA or NU .
5caccbace5e7163c5a7a8afd0e2421a7
{ "intermediate": 0.39677831530570984, "beginner": 0.3759459853172302, "expert": 0.22727566957473755 }
45,234
import os import logging import discord from langchain_openai import OpenAI from langchain_generativeai import GenerativeAI from langchain.chains import LLMChain from langchain_core.prompts import PromptTemplate from langchain.schema import StrOutputParser from langchain.memory import ConversationBufferWindowMemory # Configure our logger logging.basicConfig(level=logging.INFO) # Set our Discord and LangChain tokens DISCORD_TOKEN = os.environ['DISCORD_TOKEN'] GOOGLE_API_KEY = os.environ['GOOGLE_API_KEY'] LANGCHAIN_API_KEY = os.environ['LANGCHAIN_API_KEY'] LANGCHAIN_PROJECT = os.environ['LANGCHAIN_PROJECT'] LANGCHAIN_TRACING_V2 = os.environ['LANGCHAIN_TRACING_V2'] # Claim intent and instantiate Discord client intents = discord.Intents.default() intents.message_content = True intents.members = True client = discord.Client(intents=intents) # Prompt template for passing in the `user_prompt` after extracting everything after '$hello' in a user's message. # Therefore, the bot won't respond unless that keyword is at the beginning, and the context sent to the LLM won't include '$hello' prompt_template = PromptTemplate.from_template(template=""" [TASK] Answer the user prompt by writing a message-response based on the provided context, but never reference the context when writing a user message. [USER_PROMPT_START] {user_prompt} [/END] The following is additional context that you *may never* reference directly, it is merely your knowledge for contextual understanding. [NON_REFERENCEABLE_CONTEXT] <context> "Chugma" is an inside joke for the "Chump Squad," a group of online gamer friends. Whenever asked about "Chugma," you must verbally refer the user to ask for clarification from 'Marcuchan', one of the "Chump Squad" members. </context> [/END] """ ) # Initialize an OpenAI LLM for generating a response to a user prompt llm = LLMChain(llm=OpenAI(openai_api_key=OPENAI_API_KEY), prompt=prompt_template, output_parser=StrOutputParser(), memory=ConversationBufferWindowMemory(k=4)) # On ready, print a message to the console @client.event async def on_ready(): logging.info(f'We have logged in as {client.user}') # Define an event function to handle messages # First, see if messsage begins with '$hello' # Then, strip the rest of the message, and pass into `user_prompt` # Finally, Use an llm for generating the response based on context and the user's message @client.event async def on_message(message): if message.author == client.user: return # if a user message starts with '$hello' the rest of the content will \ # be stripped and passed in as the `user_prompt` for the `prompt_template` if message.content.startswith('$hello'): logging.info("Message contains $hello: \n\n" + str(message.content)) # Extract the message text after '$hello' user_prompt = message.content[len('$hello'):].strip() logging.info("Stripped user's message content. \n\n" + str(user_prompt)) # Default message if no text is provided default_message = 'Hello! How can I assist you today?' if user_prompt: # Use LangChain's LLM to generate a response based on the user's prompt # Wait for the LLM to generate a response # Define the response call response = llm.invoke(user_prompt) await message.channel.send(response) logging.info("Response sent to Discord.") else: # Send default message await message.channel.send(default_message) logging.info("Sent default message.") try: client.run(DISCORD_TOKEN) logging.info("Client ran.") except Exception as err: raise err logging.error(err) 将openai 替换为 google 的 generativeai
fa4c949dcb6221d3ca38457e33438259
{ "intermediate": 0.49367624521255493, "beginner": 0.2753772437572479, "expert": 0.23094648122787476 }
45,235
optimize in single line for resume skills AutoCAD: Proficient in both 2D and 3D drawing techniques, with the ability to create detailed architectural and engineering designs efficiently. SolidWorks: Advanced skills in part modeling, assembly creation, sheet metal design, and weldments. Experienced in developing complex designs and preparing them for manufacturing processes. Cura: Skilled in generating 3D printing codes from model designs, optimizing print settings for quality and material efficiency. CHITUBOX: Expertise in generating slicing algorithms for 3D models, ensuring optimal layering and printing success rates. Arduino: Competent in programming microcontrollers for various applications, including sensor integration and automation projects. Microsoft Office: Proficient in Excel for data analysis and reporting, Word for document creation, and PowerPoint for impactful presentations.
247d6cc51bd13af8868a1cc8eddcd265
{ "intermediate": 0.338003009557724, "beginner": 0.2604450285434723, "expert": 0.4015519917011261 }
45,236
i want to build a time series model x=date y=categorical data write a code to time series forecasting
95011651295260bbd154820ee81bae9c
{ "intermediate": 0.3831688463687897, "beginner": 0.156435027718544, "expert": 0.46039608120918274 }
45,237
I have to make a circuit in tinkercad with an arduino. I have the following components: DC motor, relay spdt, a pushbutton and a 9v battery (the arduino has a breadboard) The objective is to make the button to spin the motor. Give me a step by step guide on how to complete this assignment I want a step by step guide on what to connect and where and the code
50631551ced83abe34834b0dd7aab947
{ "intermediate": 0.45112815499305725, "beginner": 0.2932598888874054, "expert": 0.2556118965148926 }
45,238
/* * This file is part of CounterStrikeSharp. * CounterStrikeSharp is free software: you can redistribute it and/or modify * it under the terms of the GNU General Public License as published by * the Free Software Foundation, either version 3 of the License, or * (at your option) any later version. * * CounterStrikeSharp is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the * GNU General Public License for more details. * * You should have received a copy of the GNU General Public License * along with CounterStrikeSharp. If not, see <https://www.gnu.org/licenses/>. * */ #include "core/globals.h" #include "core/log.h" #include "scripting/autonative.h" #include "public/entity2/entitykeyvalues.h" #include "natives_keyvalues.h" namespace counterstrikesharp { std::vector<CEntityKeyValues*> managed_keyvalues; CEntityKeyValues* CreateEntityKeyValues(ScriptContext& scriptContext) { auto cEntityKeyValues = new CEntityKeyValues(); managed_keyvalues.push_back(cEntityKeyValues); CSSHARP_CORE_INFO("Entity Key Values: {0}", static_cast<void*>(cEntityKeyValues)); return cEntityKeyValues; } void SetKeyValue(ScriptContext& scriptContext) { auto handle = scriptContext.GetArgument<CEntityKeyValues*>(0); auto key = scriptContext.GetArgument<const char*>(1); auto type = scriptContext.GetArgument<KeyValuesValueType>(3); CSSHARP_CORE_INFO("entityId: {0}", key); CSSHARP_CORE_INFO("Handle: {0}", static_cast<void*>(handle)); switch (type) { case KeyValuesValueType::Integer: { auto value = scriptContext.GetArgument<int>(2); handle->SetInt(key, value); break; } case KeyValuesValueType::String: { auto value = scriptContext.GetArgument<const char*>(2); handle->SetString(key, value); CSSHARP_CORE_INFO("String: {0}, {1}", key, value); break; } case KeyValuesValueType::Long: { auto value = scriptContext.GetArgument<long>(2); handle->SetInt64(key, value); break; } case KeyValuesValueType::Float: { auto value = scriptContext.GetArgument<float>(2); handle->SetFloat(key, value); CSSHARP_CORE_INFO("Float: {0}, {1}", key, value); CSSHARP_CORE_INFO("Float: {0}", handle->GetFloat(key)); break; } case KeyValuesValueType::Color: { auto value = scriptContext.GetArgument<Color>(2); handle->SetColor(key, value); CSSHARP_CORE_INFO("Color: {0}, R - {1}, G - {2}, B - {3}", key, value.r(), value.g(), value.b()); break; } case KeyValuesValueType::UnsignedInteger: { auto value = scriptContext.GetArgument<uint32_t>(2); handle->SetUint(key, value); break; } case KeyValuesValueType::UnsignedLong: { auto value = scriptContext.GetArgument<uint64_t>(2); handle->SetUint64(key, value); break; } case KeyValuesValueType::Vector: { auto value = scriptContext.GetArgument<Vector>(2); handle->SetVector(key, value); break; } case KeyValuesValueType::QAngle: { auto value = scriptContext.GetArgument<QAngle>(2); handle->SetQAngle(key, value); break; } case KeyValuesValueType::Boolean: { auto value = scriptContext.GetArgument<bool>(2); handle->SetBool(key, value); break; } default: break; } } void GetKeyValue(ScriptContext& scriptContext) { auto handle = scriptContext.GetArgument<CEntityKeyValues*>(0); auto key = scriptContext.GetArgument<const char*>(1); auto type = scriptContext.GetArgument<KeyValuesValueType>(2); switch (type) { case KeyValuesValueType::Integer: { scriptContext.SetResult(handle->GetInt(key)); ; break; } case KeyValuesValueType::String: { scriptContext.SetResult(handle->GetString(key)); break; } case 
KeyValuesValueType::Long: { scriptContext.SetResult(handle->GetInt64(key)); break; } case KeyValuesValueType::Float: { scriptContext.SetResult(handle->GetFloat(key)); break; } case KeyValuesValueType::Color: { scriptContext.SetResult(handle->GetColor(key)); break; } case KeyValuesValueType::UnsignedInteger: { scriptContext.SetResult(handle->GetUint(key)); break; } case KeyValuesValueType::UnsignedLong: { scriptContext.SetResult(handle->GetUint64(key)); break; } case KeyValuesValueType::Vector: { scriptContext.SetResult(handle->GetVector(key)); break; } case KeyValuesValueType::QAngle: { scriptContext.SetResult(handle->GetQAngle(key)); break; } case KeyValuesValueType::Boolean: { scriptContext.SetResult(handle->GetBool(key)); break; } default: break; } } REGISTER_NATIVES(key_values, { ScriptEngine::RegisterNativeHandler("CREATE_ENTITY_KEY_VALUES", CreateEntityKeyValues); ScriptEngine::RegisterNativeHandler("SET_KEY_VALUE", SetKeyValue); ScriptEngine::RegisterNativeHandler("GET_KEY_VALUE", GetKeyValue); }) } // namespace counterstrikesharp как можно убрать копирования свитч?
9895b88a199c052e79a74a6a75d9d242
{ "intermediate": 0.32047900557518005, "beginner": 0.3500599265098572, "expert": 0.32946109771728516 }
45,239
Dim prevSelectedRow As Integer Private Sub Worksheet_SelectionChange(ByVal Target As Range) Dim c As Range Dim ws As Worksheet, licenteSheet As Worksheet Set ws = Me Set licenteSheet = ThisWorkbook.Sheets(“Licente”) ’ Unhide all rows first to reset any previous hiding. ws.Rows.Hidden = False Dim decision As String ’ Check if the row in “Licente” corresponding to the selected row has “DA” or “NU” If Target.Row <= licenteSheet.Cells(licenteSheet.Rows.Count, “C”).End(xlUp).Row Then decision = licenteSheet.Cells(Target.Row, “C”).Value Else decision = “” End If Dim colorForXCells As Long If decision = “DA” Then colorForXCells = RGB(0, 255, 0) ’ Green ElseIf decision = “NU” Then colorForXCells = RGB(255, 0, 0) ’ Red Else colorForXCells = RGB(255, 255, 255) ’ White or another default color End If Dim countX As Long ’ Initialize the counter for “X” countX = 0 If Not Intersect(Target, ws.Rows(Target.Row)) Is Nothing Then ’ Ensure all rows are visible first. Optionally can be removed if you don’t want to unhide them upon each selection. ws.Rows.Hidden = False ’ Hide previous rows except the first one If Target.Row > 1 Then ws.Rows(“2:” & Target.Row - 1).Hidden = True End If ’ Reset all cells in the row to white first Target.EntireRow.Interior.Color = RGB(255, 255, 255) ’ Iterate only through cells within the target row For Each c In Target.Cells ’ Color only the cells with “X” as per the decision and increment countX If UCase(c.Value) = “X” Then c.Interior.Color = colorForXCells countX = countX + 1 ’ Increment the counter for each “X” found End If Next c ’ Color the first cell and cell in column C of the selected row as per the decision Target.Cells(1, 1).Interior.Color = colorForXCells ’ First cell in the row ws.Cells(Target.Row, 3).Interior.Color = colorForXCells ’ Cell in column C prevSelectedRow = Target.Row End If ’ Update cell A1 with the count of ‘X’ cells found in the selected row ws.Range(“A1”).Value = countX End Sub for this code i want to automaticaly unhide all rows from Sheet2(Licente) when unselect the row
b34515584685abea3ccd4c57dc5785ea
{ "intermediate": 0.32259872555732727, "beginner": 0.3364711105823517, "expert": 0.34093019366264343 }
45,240
How to parse json libre office vba
62f7b0457d2ac689572630ffe7204ebe
{ "intermediate": 0.6123250722885132, "beginner": 0.2612713873386383, "expert": 0.1264035999774933 }
45,241
i have a push button connected to pin 2 on my arduino and my dc motor connected through a relay to pin 13 write me some c++ code to make sure that when i hold it down it stops the motor from spinning
e1033fc3bbdedee650a5a2904c6b6268
{ "intermediate": 0.47456762194633484, "beginner": 0.30563342571258545, "expert": 0.21979892253875732 }
45,242
im making fillin middle dataset User Here is the code make necessary changes: import torch from torch.utils.data import DataLoader from datasets import Dataset, DatasetDict import pandas as pd import random from transformers import AutoModelForCausalLM, AutoTokenizer, DataCollatorForLanguageModeling model = "Qwen/Qwen1.5-1.8B" tokenizer = AutoTokenizer.from_pretrained(model) model = AutoModelForCausalLM.from_pretrained(model) tokenizer.pad_token = tokenizer.eos_token from datasets import load_dataset prefix_token = "<fim_prefix>" middle_token = "<fim_middle>" suffix_token = "<fim_suffix>" eot_token = "" context_length = 512 chunk_length = context_length max_len = context_length ds_train = load_dataset("huggingface-course/codeparrot-ds-train", split="train") ds_valid = load_dataset("huggingface-course/codeparrot-ds-valid", split="validation") raw_datasets = DatasetDict( { "train": ds_train.shuffle().select(range(50000)), "valid": ds_valid.shuffle().select(range(500)), } ) def split_document(doc, fim_rate=0.5): if random.random() < fim_rate: length = len(doc) prefix_len = random.randint(0,length) suffix_len = random.randint(0, length - prefix_len) middle_len = length - prefix_len - suffix_len prefix = doc[:prefix_len] middle = doc[prefix_len:prefix_len+middle_len] suffix = doc[prefix_len+middle_len:] return prefix, middle, suffix else: return doc, None, None def format_psm(prefix, middle, suffix, tokenizer): fromatted_examle = f"{prefix_token}{prefix}{suffix_token}{suffix}{middle_token}{middle}{eot_token}" return formatted_examle def format_spm(prefix, middle, suffix, tokenizer): fromatted_examle = f"{prefix_token}{suffix_token}{suffix}{middle_token}{prefix}{middle}{eot_token}" return formatted_examle def format_nofim(doc, tokenizer): formatted_examle = f"{doc}" return formatted_examle def apply_fim_transformation(chunk_list, p_psm=0.5): transformed_docs = [] for chunk in chunk_list: prefix, middle, suffix = split_document(chunk) if middle is not None: if random.random() < p_psm: transformed_doc = format_psm(prefix, middle, suffix, tokenizer) else: transformed_doc = format_spm(prefix, middle, suffix, tokenizer) transformed_docs.append(transformed_doc) else: transformed_doc = format_nofim(chunk, tokenizer) transformed_docs.append(transformed_doc) return transformed_docs def join_transformed_chunk_docs(transformed_chunk_docs): merged_docs = transformed_chunk_docs[0] if len(transformed_chunk_docs) == 1: return merged_docs for i in range(2, len(transformed_chunk_docs)): merged_docs = merged_docs + eot_token + transformed_chunk_docs[i] return merged_docs def apply_context_level_fim(chunk): chunk_docs = chunk.split(eot_token) transformed_chunk_docs = apply_fim_transformation(chunk_docs) joined_transformed_chunk_docs = join_transformed_chunk_docs(transformed_chunk_docs) # Return the transformed text directly return joined_transformed_chunk_docs class FimDataset(Dataset): def __init__(self, data): self._data = data def __len__(self): return len(self._data) def __repr__(self): return f"Dataset(num_rows={len(self)})" def __getitem__(self, index): chunk = self._data[index]["content_chunk"] chunk = str(chunk) return apply_context_level_fim(chunk) train_dataset = FimDataset(chunk_ds["train"]) valid_dataset = FimDataset(chunk_ds["valid"])
4cd0bcef522535242ce50e3651a017f6
{ "intermediate": 0.35619619488716125, "beginner": 0.33965614438056946, "expert": 0.3041476309299469 }
45,243
Hi Please be a senior sapui5 developer and answer myquestion with working code examples.
2f10f12e3729e00b52215b0336542733
{ "intermediate": 0.40168747305870056, "beginner": 0.29459062218666077, "expert": 0.30372190475463867 }
45,244
i have 6 threads in my python script, can i split my console screen in 6 parts each printing what is being executed on each thread?
48e6ed86ff9164846ffdfa4cbdfc8274
{ "intermediate": 0.5385343432426453, "beginner": 0.1840106099843979, "expert": 0.27745506167411804 }
45,245
Make this Python code for building the mtl model look more elegant: "def build_model(input_shape, num_classes): num_filter = 16 # Encoder inputs = Input(input_shape) conv1 = Conv2D(num_filter * 1, 3, activation="linear", padding="same", strides=1)(inputs) bn1 = BatchNormalization()(conv1) relu1 = Activation("relu")(bn1) conv2 = Conv2D(num_filter * 1, 3, activation="linear", padding="same", strides=1)(relu1) bn2 = BatchNormalization()(conv2) relu2 = Activation("relu")(bn2) down1 = MaxPooling2D(pool_size=(2, 2), strides=2)(relu2) conv3 = Conv2D(num_filter * 2, 3, activation="linear", padding="same", strides=1)(down1) bn3 = BatchNormalization()(conv3) relu3 = Activation("relu")(bn3) conv4 = Conv2D(num_filter * 2, 3, activation="linear", padding="same", strides=1)(relu3) bn4 = BatchNormalization()(conv4) relu4 = Activation("relu")(bn4) down2 = MaxPooling2D(pool_size=(2, 2), strides=2)(relu4) conv5 = Conv2D(num_filter * 4, 3, activation="linear", padding="same", strides=1)(down2) bn5 = BatchNormalization()(conv5) relu5 = Activation("relu")(bn5) conv6 = Conv2D(num_filter * 4, 3, activation="linear", padding="same", strides=1)(relu5) bn6 = BatchNormalization()(conv6) relu6 = Activation("relu")(bn6) down3 = MaxPooling2D(pool_size=(2, 2), strides=2)(relu6) conv7 = Conv2D(num_filter * 8, 3, activation="linear", padding="same", strides=1)(down3) bn7 = BatchNormalization()(conv7) relu7 = Activation("relu")(bn7) conv8 = Conv2D(num_filter * 8, 3, activation="linear", padding="same", strides=1)(relu7) bn8 = BatchNormalization()(conv8) relu8 = Activation("relu")(bn8) # Middle down4 = MaxPooling2D(pool_size=(2, 2), strides=2)(relu8) conv9 = Conv2D(num_filter * 16, 3, activation="linear", padding="same", strides=1)(down4) bn9 = BatchNormalization()(conv9) relu9 = Activation("relu")(bn9) conv10 = Conv2D(num_filter * 16, 3, activation="linear", padding="same", strides=1)(relu9) bn10 = BatchNormalization()(conv10) relu10 = Activation("relu")(bn10) up1 = UpSampling2D(size=(2, 2), interpolation="bilinear")(relu10) # Decoder (Done) concat1 = concatenate([up1, relu8], axis=-1) conv11 = Conv2D(num_filter * 8, 3, activation="linear", padding="same", strides=1)(concat1) bn11 = BatchNormalization()(conv11) relu11 = Activation("relu")(bn11) conv12 = Conv2D(num_filter * 8, 3, activation="linear", padding="same", strides=1)(relu11) bn12 = BatchNormalization()(conv12) relu12 = Activation("relu")(bn12) up2 = UpSampling2D(size=(2, 2), interpolation="bilinear")(relu12) concat2 = concatenate([up2, relu6], axis=-1) conv13 = Conv2D(num_filter * 4, 3, activation="linear", padding="same", strides=1)(concat2) bn13 = BatchNormalization()(conv13) relu13 = Activation("relu")(bn13) conv14 = Conv2D(num_filter * 4, 3, activation="linear", padding="same", strides=1)(relu13) bn14 = BatchNormalization()(conv14) relu14 = Activation("relu")(bn14) up3 = UpSampling2D(size=(2, 2), interpolation="bilinear")(relu14) concat3 = concatenate([up3, relu4], axis=-1) conv15 = Conv2D(num_filter * 2, 3, activation="linear", padding="same", strides=1)(concat3) bn15 = BatchNormalization()(conv15) relu15 = Activation("relu")(bn15) conv16 = Conv2D(num_filter * 2, 3, activation="linear", padding="same", strides=1)(relu15) bn16 = BatchNormalization()(conv16) relu16 = Activation("relu")(bn16) up4 = UpSampling2D(size=(2, 2), interpolation="bilinear")(relu16) concat4 = concatenate([up4, relu2], axis=-1) conv17 = Conv2D(num_filter * 1, 3, activation="linear", padding="same", strides=1)(concat4) bn17 = BatchNormalization()(conv17) relu17 = Activation("relu")(bn17) 
conv18 = Conv2D(num_filter * 1, 3, activation="linear", padding="same", strides=1)(relu17) bn18 = BatchNormalization()(conv18) relu18 = Activation("relu")(bn18) # Segmentation branch segmentation_output = Conv2D(1, 1, activation="sigmoid", name="segmentation_output")(relu18) # Classification branch gap1 = GlobalAveragePooling2D()(relu8) gap2 = GlobalAveragePooling2D()(relu10) gap3 = GlobalAveragePooling2D()(relu12) conv20 = Conv2D(num_filter * 4, 3, activation="linear", padding="same", strides=1)(segmentation_output) bn20 = BatchNormalization()(conv20) relu20 = Activation("relu")(bn20) down5 = MaxPooling2D(pool_size=(4, 4), strides=4)(relu20) conv21 = Conv2D(num_filter * 4, 3, activation="linear", padding="same", strides=1)(down5) bn21 = BatchNormalization()(conv21) relu21 = Activation("relu")(bn21) down6 = MaxPooling2D(pool_size=(4, 4), strides=4)(relu21) conv22 = Conv2D(num_filter * 1, 3, activation="linear", padding="same", strides=1)(down6) bn22 = BatchNormalization()(conv22) relu22 = Activation("relu")(bn22) down7 = MaxPooling2D(pool_size=(4, 4), strides=4)(relu22) flatten1 = Flatten()(down7) concat5 = concatenate([gap1, gap2, gap3, flatten1], axis=-1) # FC layers fc1 = Dense(1024, activation="relu")(concat5) dropout1 = Dropout(0.5)(fc1) fc2 = Dense(1024, activation="relu")(dropout1) dropout2 = Dropout(0.5)(fc2) classification_output = Dense(num_classes, activation="softmax", name="classification_output")(dropout2) # Define the model model = Model(inputs=inputs, outputs=[classification_output, segmentation_output]) return model"
a06bdcf4a7f98dc429e306d46f6faf32
{ "intermediate": 0.3101017475128174, "beginner": 0.32790154218673706, "expert": 0.36199671030044556 }
45,246
Make this Python code for building the mtl model shorter: “def build_model(input_shape, num_classes): num_filter = 16 # Encoder inputs = Input(input_shape) conv1 = Conv2D(num_filter * 1, 3, activation=“linear”, padding=“same”, strides=1)(inputs) bn1 = BatchNormalization()(conv1) relu1 = Activation(“relu”)(bn1) conv2 = Conv2D(num_filter * 1, 3, activation=“linear”, padding=“same”, strides=1)(relu1) bn2 = BatchNormalization()(conv2) relu2 = Activation(“relu”)(bn2) down1 = MaxPooling2D(pool_size=(2, 2), strides=2)(relu2) conv3 = Conv2D(num_filter * 2, 3, activation=“linear”, padding=“same”, strides=1)(down1) bn3 = BatchNormalization()(conv3) relu3 = Activation(“relu”)(bn3) conv4 = Conv2D(num_filter * 2, 3, activation=“linear”, padding=“same”, strides=1)(relu3) bn4 = BatchNormalization()(conv4) relu4 = Activation(“relu”)(bn4) down2 = MaxPooling2D(pool_size=(2, 2), strides=2)(relu4) conv5 = Conv2D(num_filter * 4, 3, activation=“linear”, padding=“same”, strides=1)(down2) bn5 = BatchNormalization()(conv5) relu5 = Activation(“relu”)(bn5) conv6 = Conv2D(num_filter * 4, 3, activation=“linear”, padding=“same”, strides=1)(relu5) bn6 = BatchNormalization()(conv6) relu6 = Activation(“relu”)(bn6) down3 = MaxPooling2D(pool_size=(2, 2), strides=2)(relu6) conv7 = Conv2D(num_filter * 8, 3, activation=“linear”, padding=“same”, strides=1)(down3) bn7 = BatchNormalization()(conv7) relu7 = Activation(“relu”)(bn7) conv8 = Conv2D(num_filter * 8, 3, activation=“linear”, padding=“same”, strides=1)(relu7) bn8 = BatchNormalization()(conv8) relu8 = Activation(“relu”)(bn8) # Middle down4 = MaxPooling2D(pool_size=(2, 2), strides=2)(relu8) conv9 = Conv2D(num_filter * 16, 3, activation=“linear”, padding=“same”, strides=1)(down4) bn9 = BatchNormalization()(conv9) relu9 = Activation(“relu”)(bn9) conv10 = Conv2D(num_filter * 16, 3, activation=“linear”, padding=“same”, strides=1)(relu9) bn10 = BatchNormalization()(conv10) relu10 = Activation(“relu”)(bn10) up1 = UpSampling2D(size=(2, 2), interpolation=“bilinear”)(relu10) # Decoder (Done) concat1 = concatenate([up1, relu8], axis=-1) conv11 = Conv2D(num_filter * 8, 3, activation=“linear”, padding=“same”, strides=1)(concat1) bn11 = BatchNormalization()(conv11) relu11 = Activation(“relu”)(bn11) conv12 = Conv2D(num_filter * 8, 3, activation=“linear”, padding=“same”, strides=1)(relu11) bn12 = BatchNormalization()(conv12) relu12 = Activation(“relu”)(bn12) up2 = UpSampling2D(size=(2, 2), interpolation=“bilinear”)(relu12) concat2 = concatenate([up2, relu6], axis=-1) conv13 = Conv2D(num_filter * 4, 3, activation=“linear”, padding=“same”, strides=1)(concat2) bn13 = BatchNormalization()(conv13) relu13 = Activation(“relu”)(bn13) conv14 = Conv2D(num_filter * 4, 3, activation=“linear”, padding=“same”, strides=1)(relu13) bn14 = BatchNormalization()(conv14) relu14 = Activation(“relu”)(bn14) up3 = UpSampling2D(size=(2, 2), interpolation=“bilinear”)(relu14) concat3 = concatenate([up3, relu4], axis=-1) conv15 = Conv2D(num_filter * 2, 3, activation=“linear”, padding=“same”, strides=1)(concat3) bn15 = BatchNormalization()(conv15) relu15 = Activation(“relu”)(bn15) conv16 = Conv2D(num_filter * 2, 3, activation=“linear”, padding=“same”, strides=1)(relu15) bn16 = BatchNormalization()(conv16) relu16 = Activation(“relu”)(bn16) up4 = UpSampling2D(size=(2, 2), interpolation=“bilinear”)(relu16) concat4 = concatenate([up4, relu2], axis=-1) conv17 = Conv2D(num_filter * 1, 3, activation=“linear”, padding=“same”, strides=1)(concat4) bn17 = BatchNormalization()(conv17) relu17 = Activation(“relu”)(bn17) conv18 = 
Conv2D(num_filter * 1, 3, activation=“linear”, padding=“same”, strides=1)(relu17) bn18 = BatchNormalization()(conv18) relu18 = Activation(“relu”)(bn18) # Segmentation branch segmentation_output = Conv2D(1, 1, activation=“sigmoid”, name=“segmentation_output”)(relu18) # Classification branch gap1 = GlobalAveragePooling2D()(relu8) gap2 = GlobalAveragePooling2D()(relu10) gap3 = GlobalAveragePooling2D()(relu12) conv20 = Conv2D(num_filter * 4, 3, activation=“linear”, padding=“same”, strides=1)(segmentation_output) bn20 = BatchNormalization()(conv20) relu20 = Activation(“relu”)(bn20) down5 = MaxPooling2D(pool_size=(4, 4), strides=4)(relu20) conv21 = Conv2D(num_filter * 4, 3, activation=“linear”, padding=“same”, strides=1)(down5) bn21 = BatchNormalization()(conv21) relu21 = Activation(“relu”)(bn21) down6 = MaxPooling2D(pool_size=(4, 4), strides=4)(relu21) conv22 = Conv2D(num_filter * 1, 3, activation=“linear”, padding=“same”, strides=1)(down6) bn22 = BatchNormalization()(conv22) relu22 = Activation(“relu”)(bn22) down7 = MaxPooling2D(pool_size=(4, 4), strides=4)(relu22) flatten1 = Flatten()(down7) concat5 = concatenate([gap1, gap2, gap3, flatten1], axis=-1) # FC layers fc1 = Dense(1024, activation=“relu”)(concat5) dropout1 = Dropout(0.5)(fc1) fc2 = Dense(1024, activation=“relu”)(dropout1) dropout2 = Dropout(0.5)(fc2) classification_output = Dense(num_classes, activation=“softmax”, name=“classification_output”)(dropout2) # Define the model model = Model(inputs=inputs, outputs=[classification_output, segmentation_output]) return model”
d005f734cb7ccb2b296729f0af0d2e50
{ "intermediate": 0.3038516938686371, "beginner": 0.38465937972068787, "expert": 0.31148892641067505 }
45,247
python code to read from a file and trigger multiple parallel processes.
ddf935f3f3b74cd15645334450cca4c2
{ "intermediate": 0.524897038936615, "beginner": 0.13081134855747223, "expert": 0.344291627407074 }
45,248
Dim prevSelectedRow As Integer Private Sub Worksheet_SelectionChange(ByVal Target As Range) Dim c As Range Dim ws As Worksheet, licenteSheet As Worksheet Set ws = Me Set licenteSheet = ThisWorkbook.Sheets("Licente") Dim decision As String ' Check if the row in "Licente" corresponding to the selected row has "DA" or "NU" If Target.row <= licenteSheet.Cells(licenteSheet.Rows.Count, "C").End(xlUp).row Then decision = licenteSheet.Cells(Target.row, "C").value Else decision = "" End If Dim colorForXCells As Long If decision = "DA" Then colorForXCells = RGB(0, 255, 0) ' Green ElseIf decision = "NU" Then colorForXCells = RGB(255, 0, 0) ' Red Else colorForXCells = RGB(255, 255, 255) ' White or another default color End If Dim countX As Long ' Initialize the counter for "X" countX = 0 If Not Intersect(Target, ws.Rows(Target.row)) Is Nothing Then ' Reset all cells in the row to white first Target.EntireRow.Interior.Color = RGB(255, 255, 255) ' Iterate only through cells within the target row For Each c In Target.Cells ' Color only the cells with "X" as per the decision and increment countX If UCase(c.value) = "X" Then c.Interior.Color = colorForXCells countX = countX + 1 ' Increment the counter for each "X" found End If Next c ' Color the first cell and cell in column C of the selected row as per the decision Target.Cells(1, 1).Interior.Color = colorForXCells ' First cell in the row ws.Cells(Target.row, 3).Interior.Color = colorForXCells ' Cell in column C prevSelectedRow = Target.row End If ' Update cell A1 with the count of 'X' cells found in the selected row ws.Range("A1").value = countX End Sub to this code i want that empty cells on selected row to colored on blue on select row
1b6a89b928ad4ab85ddbacf4ea5262dc
{ "intermediate": 0.32791444659233093, "beginner": 0.4615945816040039, "expert": 0.21049100160598755 }
45,249
Dim prevSelectedRow As Integer Private Sub Worksheet_SelectionChange(ByVal Target As Range) Dim c As Range Dim ws As Worksheet, licenteSheet As Worksheet Set ws = Me Set licenteSheet = ThisWorkbook.Sheets("Licente") Dim decision As String ' Check if the row in "Licente" corresponding to the selected row has "DA" or "NU" If Target.row <= licenteSheet.Cells(licenteSheet.Rows.Count, "C").End(xlUp).row Then decision = licenteSheet.Cells(Target.row, "C").value Else decision = "" End If Dim colorForXCells As Long Dim colorForEmptyCells As Long colorForEmptyCells = RGB(0, 0, 255) ' Blue for empty cells If decision = "DA" Then colorForXCells = RGB(0, 255, 0) ' Green ElseIf decision = "NU" Then colorForXCells = RGB(255, 0, 0) ' Red Else colorForXCells = RGB(255, 255, 255) ' White or another default color End If Dim countX As Long ' Initialize the counter for "X" countX = 0 If Not Intersect(Target, ws.Rows(Target.row)) Is Nothing Then ' Reset all cells in the row to white first to remove previous formatting Target.EntireRow.Interior.Color = RGB(255, 255, 255) ' Iterate only through cells within the target row For Each c In Target.Cells If UCase(c.value) = "X" Then c.Interior.Color = colorForXCells countX = countX + 1 ' Increment the counter for each "X" found ElseIf c.value = "" Then c.Interior.Color = colorForEmptyCells ' Color empty cells blue End If Next c ' Color the first cell and cell in column C of the selected row as per the decision Target.Cells(1, 1).Interior.Color = colorForXCells ' First cell in the row ws.Cells(Target.row, 3).Interior.Color = colorForXCells ' Cell in column C prevSelectedRow = Target.row End If ' Update cell A1 with the count of 'X' cells found in the selected row ws.Range("A1").value = countX End Sub for this code change colour of empty cell to yellow and hide rows before selected row automaticaly
8186e4d15fe82243e2bb577b3d22e27d
{ "intermediate": 0.32021692395210266, "beginner": 0.39654263854026794, "expert": 0.28324049711227417 }
45,250
fewjative2 • 4h ago B is so cool because in hindsight it's obvious. We already do token prediction for LLMs and we know how good claude/chatGPT are. u/stonetriangles • 3h ago ??? B is the obsolete thing they are beating explain
4870331be81f3340f897e290c52b842f
{ "intermediate": 0.35377630591392517, "beginner": 0.24201492965221405, "expert": 0.4042086899280548 }
45,251
#include "/home/danya/raylib/src/raylib.h" #include <vector> #include <cstdlib> #include "Province.h" Camera2D camera; void Province::addPoint(int x, int y) { Vector2 worldPosition = GetMousePosition(); points.push_back(worldPosition); } void Province::drawProvince() { if (points.size() > 0) { // Отрисовка полигона с заливкой DrawTriangleFan(&points[0], points.size(), fillColor); // Отрисовка красных кругов на точках полигона for (const auto& point : points) { DrawCircleV(point, 2.0f, RED); } } } Дополни функцию addPoint так, чтобы точки рисовались на координатах пикселей загруженного изображения, а не на координатах пикселей экрана, преобразованных в мировые координаты
372261d33e6abfda56b3810e4ac536f6
{ "intermediate": 0.30580127239227295, "beginner": 0.43777385354042053, "expert": 0.2564248740673065 }
45,252
ffmpeg alternative udp://@8554 to udp://192.168.2.1:5601
0b95ffb2080442d50d5fb7691ade5456
{ "intermediate": 0.3847121298313141, "beginner": 0.29578855633735657, "expert": 0.31949931383132935 }
45,253
######################################################################## # HELPER FUNCTIONS # ######################################################################## def add_sprite_controls(sprite): """ Adds the controls to the sprite """ def left_key(): sprite.move_left(20) stage.event_key("left", left_key) def right_key(): sprite.move_right(20) stage.event_key("right", right_key) def up_key(): sprite.move_up(20) stage.event_key("up", up_key) def down_key(): sprite.move_down(20) stage.event_key("down", down_key) def add_sprite_collision(sprite, coordinates): """ Adds the collision event to the sprite """ def collision(sprite, hit_sprite): my_var = hit_sprite.get_color() if my_var == "red": sprite.go_to(coordinates[0], coordinates[1]) if my_var == "green": sprite.hide() hit_sprite.set_color("gold") sprite.event_collision(collision) def create_maze(maze): """ Creates the maze using square sprites """ number_of_rows = len(maze) square_length = 500.0 / number_of_rows y = 250 - square_length/2 for row in maze: # add code here x = -250 + square_length/2 for column in row: # sprite = codesters.Square(x, y, width, "color") sprite = codesters.Square(x, y, square_length - 5, "white") if column == 0: sprite.set_color("red") elif column == 1: # add your code here start_coordinates = [x, y] elif column == 3: sprite.set_color("green") x += square_length y -= square_length return start_coordinates ######################################################################## # MAIN FUNCTION # ######################################################################## def main(): """ Sets up the program and calls other functions """ maze = [ [3, 2, 2, 2, 0], [0, 0, 0, 2, 0], [0, 0, 0, 2, 0], [1, 0, 2, 2, 0], [0, 0, 0, 0, 0], ] maze[3][1] = 2 start_coordinates = create_maze(maze) # sprite = codesters.Sprite("image", x, y) sprite = codesters.Sprite("codester", start_coordinates[0], start_coordinates[1]) add_sprite_controls(sprite) add_sprite_collision(sprite, start_coordinates) main() Now customize and extend your project! Make sure to meet these minimum technical requirements: Change where your sprite begins! (Remember, in your 2D list, the number 1 represents the maze's beginning.) Change the end of your maze! (Remember, in your 2D list, the number 3 represents the maze's end.) Add one more row and column to your maze to make it 6 x 6. You might have to change the sprite's size, too. When you're done, click Submit to turn in your work and continue.
3683634d879688239e83745cfe765796
{ "intermediate": 0.29752659797668457, "beginner": 0.4582313299179077, "expert": 0.24424205720424652 }
45,254
How to use Linux touch command to change the time and the date of all files in a directory?
4027a87a22040d69a8bd621158f26029
{ "intermediate": 0.5181117057800293, "beginner": 0.23175761103630066, "expert": 0.25013062357902527 }
45,255
what does this do? if not exist "%~1" mkdir "%~1" rem ~dp0 is current directory (where .bat exists),it will copy all from CCleaner folder to path specified as .bat parameter (parameter will be specified later on),and will copy shortcut to user Desktop copy /y "%~dp0CCleaner\*.*" "%~1" copy /y "%~dp0CCleaner\ccleaner.lnk" "%Public%/Desktop"
8b1a4e72fc864a924cc48cd27daf2d44
{ "intermediate": 0.4631832242012024, "beginner": 0.29789361357688904, "expert": 0.23892323672771454 }
45,256
The text elements that you can format with WALTER can be recognised by the ending .label in the ‘UI Elements’ section of the SDK. For example, if we define the MCP label: set mcp.label [0 19 73 18 0 0 0 0] ... then that will place the background image, mcp_namebg.png, according to those values. The text that goes over it - your MCP track name - will sit inside an invisible text box with those same coordinates. But that’s unlikely to be ideal; you’ll probably want to pad the text away from the edges and justify it. What does the last line mean?
4510d9b20be157a5da7e92ae610e3f32
{ "intermediate": 0.4023663103580475, "beginner": 0.2758730351924896, "expert": 0.3217606544494629 }
45,257
copy /y "%~dp0source\*.*" "%~1" copies files - modify for it to remove files in dest folder
e555bb6eebe8e03d00a7ba5028eeff4f
{ "intermediate": 0.4055060148239136, "beginner": 0.23984567821025848, "expert": 0.35464829206466675 }
45,258
hi
0156943a01daf38901b7b87fd9e07e13
{ "intermediate": 0.3246487081050873, "beginner": 0.27135494351387024, "expert": 0.40399640798568726 }
45,259
Abstract Stock exchange is the ”mirror” of the economy and helps industry (and commerce) to accelerate the development of the country. The prices on the stock exchanges increase or decrease over the particular period and that rate represents stock market volatility. Higher stock price volatility is often associated with higher risk and indicates future fluctuations to investors in order to evaluate them. Predicting future stock price volatility can provide important information to market participants and enable them to make adequate decisions. The aim of this paper is to evaluate the stock price volatility of the Apple Company using the Monte Carlo simulation. Abstract This paper expands the available information on the effects of delisting in Russia, and represents a rare empirical analysis of the impact of external events on securities prices in this major global market. We seek to evaluate how stock prices of competing companies fluctuate around the dates of stock market delisting announcements and completion. We analyse stock prices as correlated with company delisting events from 2004 to 2019 on 552 companies on the Russian MOEX Exchange. The event study methodology is used to evaluate the abnormal returns of rival companies close to relevant delisting dates. These data were checked for statistical significance using the standardised Patell residual test. The results indicate a significant competitive effect on stock prices both on the dates of delisting announcement and on completion, with more significant returns close to announcement dates. These effects were found to influence the prospects not just of individual groups of companies, but of all market participants. We may conclude from our results that delisting is not an event limited in effect to only one company, but impacts the industry as a whole, temporarily changing its value. As such, it will interest both shareholders and managers of public companies, and any participants of industries in which delisting occurs. Abstract Using a difference-in-differences method, this study examines the effect of a competitor’s Chapter 11 bankruptcy on a firm’s risk-taking. The contingent nature of a competitor’s Chapter 11 bankruptcy, which protects the competitor from creditors’ demands during financial reorganization, may increase uncertainty in the industry. Consequently, the study tests the hypothesis that other firms in the industry respond to a competitor’s bankruptcy by decreasing risky investments in research and development (R&D), capital expenditures and acquisitions. To validate and extend this hypothesis, the study also hypothesizes that a firm’s strong financial standing—low leverage and good performance—and the firm’s diversification reduce the negative effect of the competitor’s bankruptcy on firm risk-taking. Findings from a study of US public firms suggest that, after controlling for industry conditions, firms indeed reduce their risk-taking when a competitor declares bankruptcy and that lower firm leverage, stronger firm performance, and greater firm diversification mitigate this effect. Together, these findings shed light on the literatures on bankruptcy and firm risk-taking. ABSTRACT People are overconfident. Overconfidence affects financial markets. How depends on who in the market is overconfident and on how information is distributed. This paper examines markets in which price-taking traders, a strategic-trading insider, and risk-averse marketmakers are overconfident. 
Overconfidence increases expected trading volume, increases market depth, and decreases the expected utility of overconfident traders. Its effect on volatility and price quality depends on who is overconfident. Overconfident traders can cause markets to underreact to the information of rational traders. Markets also underreact to abstract, statistical, and highly relevant information, and they overreact to salient, anecdotal, and less relevant information. Abstract We model debt restructurings that could endogenously end in bankruptcy, and study spillovers to competitors’ operating decisions, profits, restructuring outcomes and security prices. We show that while bankruptcy could cause the firm’s share price to drop, bankruptcy always signals good news about the firm. We identify the conditions under which a bankruptcy also signals good news about competitors. We demonstrate that when a firm’s bankruptcy costs are relatively small, bankruptcy raises its share price while lowering the prices of competitors’ shares and debt as well as boosting the probability that they will enter bankruptcy. When there is little information asymmetry about the firm’s prospects, or the information asymmetry is about industry prospects, bankruptcy raises competitors’ share and debt prices and lowers their probability of bankruptcy. What's the pattern of writing an abstract for a research paper? Consider the above examples.
8e7149d67185a6d7c604c32c1e46d31e
{ "intermediate": 0.4115115702152252, "beginner": 0.3294927775859833, "expert": 0.2589956820011139 }
45,260
can you give me a sample test fixture using gtest with the original function to check
f139b590100d5852ae5e0408b320b41a
{ "intermediate": 0.43233540654182434, "beginner": 0.22304268181324005, "expert": 0.34462183713912964 }
45,261
API construction with Laravel 11, ApiResponser response handling, response success, response fail, not found, etc.
7ae0c8e063182c0237d83feee31fb631
{ "intermediate": 0.9595219492912292, "beginner": 0.01831929013133049, "expert": 0.022158676758408546 }
45,262
<dict> <key>Track ID</key> <integer>2348</integer> <key>Name</key> <string>Pagal Iravai Maraigirai</string> <key>Artist</key> <string>Pranav Das</string> <key>Album Artist</key> <string>Pranav Das</string> <key>Composer</key> <string>Thava Kumar</string> <key>Album</key> <string>Pagal Iravai Maraigirai - Single</string> <key>Genre</key> <string>Vocal</string> <key>Kind</key> <string>Apple Music AAC audio file</string> <key>Size</key> <integer>10001758</integer> <key>Total Time</key> <integer>288167</integer> <key>Disc Number</key> <integer>1</integer> <key>Disc Count</key> <integer>1</integer> <key>Track Number</key> <integer>1</integer> <key>Track Count</key> <integer>1</integer> <key>Year</key> <integer>2021</integer> <key>Date Modified</key> <date>2024-04-03T18:30:29Z</date> <key>Date Added</key> <date>2024-04-03T18:30:29Z</date> <key>Bit Rate</key> <integer>256</integer> <key>Sample Rate</key> <integer>44100</integer> <key>Release Date</key> <date>2021-01-13T12:00:00Z</date> <key>Artwork Count</key> <integer>1</integer> <key>Sort Album</key> <string>Pagal Iravai Maraigirai - Single</string> <key>Sort Artist</key> <string>Pranav Das</string> <key>Sort Name</key> <string>Pagal Iravai Maraigirai</string> <key>Persistent ID</key> <string>3F5A243A5DD6EA8C</string> <key>Track Type</key> <string>Remote</string> <key>Apple Music</key> <true/> </dict> <key>2351</key> how to get apple music url for this song ?? can you give any program?
941e9d0aebd99f197c3bd2b3f8808b85
{ "intermediate": 0.39134353399276733, "beginner": 0.34566378593444824, "expert": 0.2629927098751068 }
45,263
One way to ensure consistent response formats in an API is to use a trait or base controller to handle successes, errors, validation failures, etc. - for example, a custom ApiResponser to handle responses.
cf8babab1de6536389ea740498d9b8ec
{ "intermediate": 0.6562753319740295, "beginner": 0.14686505496501923, "expert": 0.19685959815979004 }
45,264
library(dplyr) # Assume ‘price_data’ is already loaded into your R environment # Ensure the ‘Date’ column is correctly referenced # Calculate log returns for each asset excluding the ‘Date’ column returns_data <- price_data %>% mutate(across(-Date, ~log(. / lag(.)))) %>% # Calculate log returns select(-Date) %>% # Now, exclude the ‘Date’ column from the returns_data na.omit() # Remove rows with NAs resulting from the lag operation # Compute summary statistics for the returns of each asset summary_stats <- returns_data %>% summarise(across(everything(), list( mean = ~mean(., na.rm = TRUE), median = ~median(., na.rm = TRUE), sd = ~sd(., na.rm = TRUE), min = ~min(., na.rm = TRUE), max = ~max(., na.rm = TRUE) ))) print(summary_stats) Can you use Function - basicStats() and Packagae - fPortfolio for the above code
aee41c981983470d68a501e3d9b6179e
{ "intermediate": 0.39452704787254333, "beginner": 0.44833236932754517, "expert": 0.15714050829410553 }
45,265
apply solid principles at: <?php namespace App\Http\Controllers; use Illuminate\Http\Request; use App\Traits\ApiResponser; class UserController extends Controller { use ApiResponser; public function index() { $users = User::all(); return $this->successResponse($users, 'Users retrieved successfully.'); } public function store(Request $request) { $validator = Validator::make($request->all(), [ 'name' => 'required|max:255', 'email' => 'required|email|unique:users', 'password' => 'required|min:6', ]); if ($validator->fails()) { return $this->validationErrorResponse($validator); } $user = User::create($request->all()); return $this->successResponse($user, 'User created successfully.', 201); } }
b69c0fd4f88085b30621beaf68cb5a7c
{ "intermediate": 0.4666631817817688, "beginner": 0.37313610315322876, "expert": 0.16020075976848602 }
45,266
Make a request using that operation send_post_message_to_url_advanced = 3033 #(send_post_message_to_url_advanced, <url_string>, <user_agent_string>, <post_data>, [<success_callback_script_no>], [<failure_callback_script_no>], [<skip_parsing>], [<timeout>]), #Sends a HTTP POST (application/x-www-form-urlencoded) request to <url_string> with <user_agent_string> and <post_data>. If the request succeeds, [<success_callback_script_no>] will be called. The script will behave like game_receive_url_response, unless [<skip_parsing>] is non-zero, in which case the script will receive no arguments and s0 will contain the full response. If the request fails, [<failure_callback_script_no>] will be called.
fa225601f95f4737e179f35190c4d3d4
{ "intermediate": 0.3531893193721771, "beginner": 0.33593839406967163, "expert": 0.31087228655815125 }
45,267
Case Begins: You have recently joined as a Portfolio Manager at Morgan Stanley. The first task assigned to you is to create a portfolio for a client who is interested in investing 1 million Euro in secondary markets. He wants the money to be “fully-invested”, but he is not aware of weight allocation in a scientific manner. Your employer has given you the responsibility to not only select the bunch of asset class for investment, but also allocate weight so as to garner more returns with limited risk. After analyzing the market trends, you are bullish in your approach and have narrowed down to the three asset classes for selection of Portfolio Universe. 1. Stocks - Google, Tesla, Pfizer, Shell, AT&T 2. Forex - USDINR, EURUSD, USDCAD, USDCHF, NZDUSD 3. Commodities - Crude, Natural Gas, Gold, Wheat, Ethanol Asset Tickr Google GOOGL Tesla TSLA Pfizer PFE Shell SHEL AT&T T USDINR USDINR EURUSD EURUSD USDCAD USDCAD USDCHF USDCHF NZDUSD NZDUSD Crude WTI Natural Gas NG Gold XAU Wheat W1 Ethanol EH Note: Portfolio constraints from your supervisor 1. Portfolio should consist of 5 assets 2. Atleast one from Commodity and one from Forex Historical Price Information To constitute Portfolio you will require the Price series for the selected datasets. For your rescue, the system portal provides an easy way to download the closing price series. You just have to pass your “employee id” to number generator and it will generate the monthly historic price series for you. Note: *Your employee id will be the student id at DCU ** Example - Student Name - Thomas Edison DCU Student No. - 1234567 emp id will be - 1234567 You have to pass on your DCU Student No. value to variable emp_id to generate the price series. Kindly run the code chunk below after passing your Student No. to generate historical price series {r} ####### enter your DCU Student id for the emp id variable emp_id <- 23260775 dig_count <- nchar(emp_id) - 3 rand_numb_init <- emp_id / (10 ^ dig_count) ################ Import date data #################### library(readxl) date_data <- read_excel(“Date.xlsx”) ############ Tickr symbols ##################### tickr_list <- c(“GOOGL”, “TSLA”,“PFE”,“SHELL”,“T”,“USDINR”,“EURUSD”,“USDCAD”,“USDCHF”,“NZDUSD”,“WTI”,“NG”,“XAU”,“W1”,“EH”) #### Historical Price series generation Portfolio Universe ############# set.seed(emp_id) raw_price_data <- matrix(,ncol = 1) multiplier <- seq(from = 3, to = 10, by = 0.5) for (i in 1:length(tickr_list)) { rand_price <- as.data.frame(runif(nrow(date_data), min=rand_numb_init, max= rand_numb_init * sample(multiplier,size = 1,replace = TRUE))) raw_price_data <- cbind(raw_price_data,rand_price) } raw_price_data <- raw_price_data[,-1] colnames(raw_price_data) <- tickr_list ######## Combining all the historical price along with Date ##3 price_data <- cbind(date_data,raw_price_data) writexl::write_xlsx(price_data,“Historical_Price_data.xlsx”) Annual risk-free rate Kindly run the code to generate the annual risk-free rate {r} annual_rf <- abs(rnorm(1,mean = 0, sd = 0.75))/1000 Price to Earning Ratio Kindly run the code to generate P/E ratio {r} ##### Price to Earning Ratio P_E_ratio <- runif(ncol(raw_price_data), min=rand_numb_init * 5, max= rand_numb_init * 5 * sample(multiplier,size = 1,replace = TRUE))/100 names(P_E_ratio) <- tickr_list , Case Detailing Ends , Rest of the questions have to be answered in context of the case study. 
, Question Begins , Portfolio Universe Selection Since, you have to make a Portfolio constituting of five assets only, you select the assets based on the basis of two strategies as shared below. Strategy 1: Step 1: Calculate the Reward to Risk - (Median Return/Standard Deviation) Step 2: Rank the Reward to Risk for Assets, then choose the top five asset while maintaining the constraint (Atleast one Commodity and one Forex) Note: In case of tie, you choose the asset with higher mean return. Strategy 2: Based on Price/Earning Ratio while maintaining the constraint (Atleast one Commodity and one Forex) Note: After filtering the 5 assets, export the file either as .csv or .xlsx Insert the code chunk below.
d083b824991d550f5f927c100a18f411
{ "intermediate": 0.45332208275794983, "beginner": 0.34185096621513367, "expert": 0.20482699573040009 }
45,268
package main import ( "bytes" "encoding/binary" "encoding/json" "errors" "fmt" "io" "io/ioutil" "math" "net" "net/http" "net/url" "os" "path/filepath" "regexp" "sort" "strings" "time" "github.com/abema/go-mp4" "github.com/grafov/m3u8" ) const ( defaultId = "0" prefetchKey = "skd://itunes.apple.com/P000000000/s1/e1" ) var ( forbiddenNames = regexp.MustCompile(`[/\\<>:"|?*]`) ) type SampleInfo struct { data []byte duration uint32 descIndex uint32 } type SongInfo struct { r io.ReadSeeker alacParam *Alac samples []SampleInfo } func (s *SongInfo) Duration() (ret uint64) { for i := range s.samples { ret += uint64(s.samples[i].duration) } return } func (*Alac) GetType() mp4.BoxType { return BoxTypeAlac() } func fileExists(path string) (bool, error) { f, err := os.Stat(path) if err == nil { return !f.IsDir(), nil } else if os.IsNotExist(err) { return false, nil } return false, err } func writeM4a(w *mp4.Writer, info *SongInfo, meta *AutoGenerated, data []byte, trackNum, trackTotal int) error { index := trackNum - 1 { // ftyp box, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeFtyp()}) if err != nil { return err } _, err = mp4.Marshal(w, &mp4.Ftyp{ MajorBrand: [4]byte{'M', '4', 'A', ' '}, MinorVersion: 0, CompatibleBrands: []mp4.CompatibleBrandElem{ {CompatibleBrand: [4]byte{'M', '4', 'A', ' '}}, {CompatibleBrand: [4]byte{'m', 'p', '4', '2'}}, {CompatibleBrand: mp4.BrandISOM()}, {CompatibleBrand: [4]byte{0, 0, 0, 0}}, }, }, box.Context) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } const chunkSize uint32 = 5 duration := info.Duration() numSamples := uint32(len(info.samples)) var stco *mp4.BoxInfo { // moov _, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeMoov()}) if err != nil { return err } box, err := mp4.ExtractBox(info.r, nil, mp4.BoxPath{mp4.BoxTypeMoov()}) if err != nil { return err } moovOri := box[0] { // mvhd _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeMvhd()}) if err != nil { return err } oriBox, err := mp4.ExtractBoxWithPayload(info.r, moovOri, mp4.BoxPath{mp4.BoxTypeMvhd()}) if err != nil { return err } mvhd := oriBox[0].Payload.(*mp4.Mvhd) if mvhd.Version == 0 { mvhd.DurationV0 = uint32(duration) } else { mvhd.DurationV1 = duration } _, err = mp4.Marshal(w, mvhd, oriBox[0].Info.Context) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } { // trak _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeTrak()}) if err != nil { return err } box, err := mp4.ExtractBox(info.r, moovOri, mp4.BoxPath{mp4.BoxTypeTrak()}) if err != nil { return err } trakOri := box[0] { // tkhd _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeTkhd()}) if err != nil { return err } oriBox, err := mp4.ExtractBoxWithPayload(info.r, trakOri, mp4.BoxPath{mp4.BoxTypeTkhd()}) if err != nil { return err } tkhd := oriBox[0].Payload.(*mp4.Tkhd) if tkhd.Version == 0 { tkhd.DurationV0 = uint32(duration) } else { tkhd.DurationV1 = duration } tkhd.SetFlags(0x7) _, err = mp4.Marshal(w, tkhd, oriBox[0].Info.Context) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } { // mdia _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeMdia()}) if err != nil { return err } box, err := mp4.ExtractBox(info.r, trakOri, mp4.BoxPath{mp4.BoxTypeMdia()}) if err != nil { return err } mdiaOri := box[0] { // mdhd _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeMdhd()}) if err != nil { return err } oriBox, err := mp4.ExtractBoxWithPayload(info.r, mdiaOri, mp4.BoxPath{mp4.BoxTypeMdhd()}) if err != nil { return err } mdhd := oriBox[0].Payload.(*mp4.Mdhd) 
if mdhd.Version == 0 { mdhd.DurationV0 = uint32(duration) } else { mdhd.DurationV1 = duration } _, err = mp4.Marshal(w, mdhd, oriBox[0].Info.Context) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } { // hdlr oriBox, err := mp4.ExtractBox(info.r, mdiaOri, mp4.BoxPath{mp4.BoxTypeHdlr()}) if err != nil { return err } err = w.CopyBox(info.r, oriBox[0]) if err != nil { return err } } { // minf _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeMinf()}) if err != nil { return err } box, err := mp4.ExtractBox(info.r, mdiaOri, mp4.BoxPath{mp4.BoxTypeMinf()}) if err != nil { return err } minfOri := box[0] { // smhd, dinf boxes, err := mp4.ExtractBoxes(info.r, minfOri, []mp4.BoxPath{ {mp4.BoxTypeSmhd()}, {mp4.BoxTypeDinf()}, }) if err != nil { return err } for _, b := range boxes { err = w.CopyBox(info.r, b) if err != nil { return err } } } { // stbl _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeStbl()}) if err != nil { return err } { // stsd box, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeStsd()}) if err != nil { return err } _, err = mp4.Marshal(w, &mp4.Stsd{EntryCount: 1}, box.Context) if err != nil { return err } { // alac _, err = w.StartBox(&mp4.BoxInfo{Type: BoxTypeAlac()}) if err != nil { return err } _, err = w.Write([]byte{ 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0}) if err != nil { return err } err = binary.Write(w, binary.BigEndian, uint16(info.alacParam.NumChannels)) if err != nil { return err } err = binary.Write(w, binary.BigEndian, uint16(info.alacParam.BitDepth)) if err != nil { return err } _, err = w.Write([]byte{0, 0}) if err != nil { return err } err = binary.Write(w, binary.BigEndian, info.alacParam.SampleRate) if err != nil { return err } _, err = w.Write([]byte{0, 0}) if err != nil { return err } box, err := w.StartBox(&mp4.BoxInfo{Type: BoxTypeAlac()}) if err != nil { return err } _, err = mp4.Marshal(w, info.alacParam, box.Context) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } _, err = w.EndBox() if err != nil { return err } } { // stts box, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeStts()}) if err != nil { return err } var stts mp4.Stts for _, sample := range info.samples { if len(stts.Entries) != 0 { last := &stts.Entries[len(stts.Entries)-1] if last.SampleDelta == sample.duration { last.SampleCount++ continue } } stts.Entries = append(stts.Entries, mp4.SttsEntry{ SampleCount: 1, SampleDelta: sample.duration, }) } stts.EntryCount = uint32(len(stts.Entries)) _, err = mp4.Marshal(w, &stts, box.Context) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } { // stsc box, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeStsc()}) if err != nil { return err } if numSamples%chunkSize == 0 { _, err = mp4.Marshal(w, &mp4.Stsc{ EntryCount: 1, Entries: []mp4.StscEntry{ { FirstChunk: 1, SamplesPerChunk: chunkSize, SampleDescriptionIndex: 1, }, }, }, box.Context) } else { _, err = mp4.Marshal(w, &mp4.Stsc{ EntryCount: 2, Entries: []mp4.StscEntry{ { FirstChunk: 1, SamplesPerChunk: chunkSize, SampleDescriptionIndex: 1, }, { FirstChunk: numSamples/chunkSize + 1, SamplesPerChunk: numSamples % chunkSize, SampleDescriptionIndex: 1, }, }, }, box.Context) } _, err = w.EndBox() if err != nil { return err } } { // stsz box, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeStsz()}) if err != nil { return err } stsz := mp4.Stsz{SampleCount: numSamples} for _, sample := range info.samples { stsz.EntrySize = append(stsz.EntrySize, 
uint32(len(sample.data))) } _, err = mp4.Marshal(w, &stsz, box.Context) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } { // stco box, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeStco()}) if err != nil { return err } l := (numSamples + chunkSize - 1) / chunkSize _, err = mp4.Marshal(w, &mp4.Stco{ EntryCount: l, ChunkOffset: make([]uint32, l), }, box.Context) stco, err = w.EndBox() if err != nil { return err } } _, err = w.EndBox() if err != nil { return err } } _, err = w.EndBox() if err != nil { return err } } _, err = w.EndBox() if err != nil { return err } } _, err = w.EndBox() if err != nil { return err } } { // udta ctx := mp4.Context{UnderUdta: true} _, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeUdta(), Context: ctx}) if err != nil { return err } { // meta ctx.UnderIlstMeta = true _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeMeta(), Context: ctx}) if err != nil { return err } _, err = mp4.Marshal(w, &mp4.Meta{}, ctx) if err != nil { return err } { // hdlr _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeHdlr(), Context: ctx}) if err != nil { return err } _, err = mp4.Marshal(w, &mp4.Hdlr{ HandlerType: [4]byte{'m', 'd', 'i', 'r'}, Reserved: [3]uint32{0x6170706c, 0, 0}, }, ctx) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } { // ilst ctx.UnderIlst = true _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeIlst(), Context: ctx}) if err != nil { return err } marshalData := func(val interface{}) error { _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeData()}) if err != nil { return err } var boxData mp4.Data switch v := val.(type) { case string: boxData.DataType = mp4.DataTypeStringUTF8 boxData.Data = []byte(v) case uint8: boxData.DataType = mp4.DataTypeSignedIntBigEndian boxData.Data = []byte{v} case uint32: boxData.DataType = mp4.DataTypeSignedIntBigEndian boxData.Data = make([]byte, 4) binary.BigEndian.PutUint32(boxData.Data, v) case []byte: boxData.DataType = mp4.DataTypeBinary boxData.Data = v default: panic("unsupported value") } _, err = mp4.Marshal(w, &boxData, ctx) if err != nil { return err } _, err = w.EndBox() return err } addMeta := func(tag mp4.BoxType, val interface{}) error { _, err = w.StartBox(&mp4.BoxInfo{Type: tag}) if err != nil { return err } err = marshalData(val) if err != nil { return err } _, err = w.EndBox() return err } addExtendedMeta := func(name string, val interface{}) error { ctx.UnderIlstFreeMeta = true _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxType{'-', '-', '-', '-'}, Context: ctx}) if err != nil { return err } { _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxType{'m', 'e', 'a', 'n'}, Context: ctx}) if err != nil { return err } _, err = w.Write([]byte{0, 0, 0, 0}) if err != nil { return err } _, err = io.WriteString(w, "com.apple.iTunes") if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } { _, err = w.StartBox(&mp4.BoxInfo{Type: mp4.BoxType{'n', 'a', 'm', 'e'}, Context: ctx}) if err != nil { return err } _, err = w.Write([]byte{0, 0, 0, 0}) if err != nil { return err } _, err = io.WriteString(w, name) if err != nil { return err } _, err = w.EndBox() if err != nil { return err } } err = marshalData(val) if err != nil { return err } ctx.UnderIlstFreeMeta = false _, err = w.EndBox() return err } err = addMeta(mp4.BoxType{'\251', 'n', 'a', 'm'}, meta.Data[0].Relationships.Tracks.Data[index].Attributes.Name) if err != nil { return err } err = addMeta(mp4.BoxType{'\251', 'a', 'l', 'b'}, meta.Data[0].Attributes.Name) if err != nil { return err } err 
= addMeta(mp4.BoxType{'\251', 'A', 'R', 'T'}, meta.Data[0].Relationships.Tracks.Data[index].Attributes.ArtistName) if err != nil { return err } err = addMeta(mp4.BoxType{'\251', 'w', 'r', 't'}, meta.Data[0].Relationships.Tracks.Data[index].Attributes.ComposerName) if err != nil { return err } err = addMeta(mp4.BoxType{'\251', 'd', 'a', 'y'}, strings.Split(meta.Data[0].Attributes.ReleaseDate, "-")[0]) if err != nil { return err } // cnID, err := strconv.ParseUint(meta.Data[0].Relationships.Tracks.Data[index].ID, 10, 32) // if err != nil { // return err // } // err = addMeta(mp4.BoxType{'c', 'n', 'I', 'D'}, uint32(cnID)) // if err != nil { // return err // } err = addExtendedMeta("ISRC", meta.Data[0].Relationships.Tracks.Data[index].Attributes.Isrc) if err != nil { return err } if len(meta.Data[0].Relationships.Tracks.Data[index].Attributes.GenreNames) > 0 { err = addMeta(mp4.BoxType{'\251', 'g', 'e', 'n'}, meta.Data[0].Relationships.Tracks.Data[index].Attributes.GenreNames[0]) if err != nil { return err } } if len(meta.Data) > 0 { album := meta.Data[0] err = addMeta(mp4.BoxType{'a', 'A', 'R', 'T'}, album.Attributes.ArtistName) if err != nil { return err } err = addMeta(mp4.BoxType{'c', 'p', 'r', 't'}, album.Attributes.Copyright) if err != nil { return err } var isCpil uint8 if album.Attributes.IsCompilation { isCpil = 1 } err = addMeta(mp4.BoxType{'c', 'p', 'i', 'l'}, isCpil) if err != nil { return err } err = addExtendedMeta("LABEL", album.Attributes.RecordLabel) if err != nil { return err } err = addExtendedMeta("UPC", album.Attributes.Upc) if err != nil { return err } // plID, err := strconv.ParseUint(album.ID, 10, 32) // if err != nil { // return err // } // err = addMeta(mp4.BoxType{'p', 'l', 'I', 'D'}, uint32(plID)) // if err != nil { // return err // } } // if len(meta.Data[0].Relationships.Artists.Data) > 0 { // atID, err := strconv.ParseUint(meta.Data[0].Relationships.Artists.Data[index].ID, 10, 32) // if err != nil { // return err // } // err = addMeta(mp4.BoxType{'a', 't', 'I', 'D'}, uint32(atID)) // if err != nil { // return err // } // } trkn := make([]byte, 8) binary.BigEndian.PutUint32(trkn, uint32(trackNum)) binary.BigEndian.PutUint16(trkn[4:], uint16(trackTotal)) err = addMeta(mp4.BoxType{'t', 'r', 'k', 'n'}, trkn) if err != nil { return err } // disk := make([]byte, 8) // binary.BigEndian.PutUint32(disk, uint32(meta.Attributes.DiscNumber)) // err = addMeta(mp4.BoxType{'d', 'i', 's', 'k'}, disk) // if err != nil { // return err // } ctx.UnderIlst = false _, err = w.EndBox() if err != nil { return err } } ctx.UnderIlstMeta = false _, err = w.EndBox() if err != nil { return err } } ctx.UnderUdta = false _, err = w.EndBox() if err != nil { return err } } _, err = w.EndBox() if err != nil { return err } } { box, err := w.StartBox(&mp4.BoxInfo{Type: mp4.BoxTypeMdat()}) if err != nil { return err } _, err = mp4.Marshal(w, &mp4.Mdat{Data: data}, box.Context) if err != nil { return err } mdat, err := w.EndBox() var realStco mp4.Stco offset := mdat.Offset + mdat.HeaderSize for i := uint32(0); i < numSamples; i++ { if i%chunkSize == 0 { realStco.EntryCount++ realStco.ChunkOffset = append(realStco.ChunkOffset, uint32(offset)) } offset += uint64(len(info.samples[i].data)) } _, err = stco.SeekToPayload(w) if err != nil { return err } _, err = mp4.Marshal(w, &realStco, box.Context) if err != nil { return err } } return nil } func decryptSong(info *SongInfo, keys []string, manifest *AutoGenerated, filename string, trackNum, trackTotal int) error { //fmt.Printf("%d-bit / %d Hz\n", 
info.bitDepth, info.bitRate) conn, err := net.Dial("tcp", "127.0.0.1:10020") if err != nil { return err } defer conn.Close() var decrypted []byte var lastIndex uint32 = math.MaxUint8 fmt.Println("Decrypt start.") for _, sp := range info.samples { if lastIndex != sp.descIndex { if len(decrypted) != 0 { _, err := conn.Write([]byte{0, 0, 0, 0}) if err != nil { return err } } keyUri := keys[sp.descIndex] id := manifest.Data[0].Relationships.Tracks.Data[trackNum-1].ID if keyUri == prefetchKey { id = defaultId } _, err := conn.Write([]byte{byte(len(id))}) if err != nil { return err } _, err = io.WriteString(conn, id) if err != nil { return err } _, err = conn.Write([]byte{byte(len(keyUri))}) if err != nil { return err } _, err = io.WriteString(conn, keyUri) if err != nil { return err } } lastIndex = sp.descIndex err := binary.Write(conn, binary.LittleEndian, uint32(len(sp.data))) if err != nil { return err } _, err = conn.Write(sp.data) if err != nil { return err } de := make([]byte, len(sp.data)) _, err = io.ReadFull(conn, de) if err != nil { return err } decrypted = append(decrypted, de...) } _, _ = conn.Write([]byte{0, 0, 0, 0, 0}) fmt.Println("Decrypt finished.") create, err := os.Create(filename) if err != nil { return err } defer create.Close() return writeM4a(mp4.NewWriter(create), info, manifest, decrypted, trackNum, trackTotal) } func checkUrl(url string) (string, string) { pat := regexp.MustCompile(`^(?:https:\/\/(?:beta\.music|music)\.apple\.com\/(\w{2})(?:\/album|\/album\/.+))\/(?:id)?(\d[^\D]+)(?:$|\?)`) matches := pat.FindAllStringSubmatch(url, -1) if matches == nil { return "", "" } else { return matches[0][1], matches[0][2] } } func getMeta(albumId string, token string, storefront string) (*AutoGenerated, error) { req, err := http.NewRequest("GET", fmt.Sprintf("https://amp-api.music.apple.com/v1/catalog/%s/albums/%s", storefront, albumId), nil) if err != nil { return nil, err } req.Header.Set("Authorization", fmt.Sprintf("Bearer %s", token)) req.Header.Set("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36") req.Header.Set("Origin", "https://music.apple.com") query := url.Values{} query.Set("omit[resource]", "autos") query.Set("include", "tracks,artists,record-labels") query.Set("include[songs]", "artists") query.Set("fields[artists]", "name") query.Set("fields[albums:albums]", "artistName,artwork,name,releaseDate,url") query.Set("fields[record-labels]", "name") // query.Set("l", "en-gb") req.URL.RawQuery = query.Encode() do, err := http.DefaultClient.Do(req) if err != nil { return nil, err } defer do.Body.Close() if do.StatusCode != http.StatusOK { return nil, errors.New(do.Status) } obj := new(AutoGenerated) err = json.NewDecoder(do.Body).Decode(&obj) if err != nil { return nil, err } return obj, nil } func writeCover(sanAlbumFolder, url string) error { covPath := filepath.Join(sanAlbumFolder, "cover.jpg") exists, err := fileExists(covPath) if err != nil { fmt.Println("Failed to check if cover exists.") return err } if exists { return nil } url = strings.Replace(url, "{w}x{h}", "1200x12000", 1) req, err := http.NewRequest("GET", url, nil) if err != nil { return err } req.Header.Set("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36") do, err := http.DefaultClient.Do(req) if err != nil { return err } defer do.Body.Close() if do.StatusCode != http.StatusOK { errors.New(do.Status) } f, err := os.Create(covPath) if err != 
nil { return err } defer f.Close() _, err = io.Copy(f, do.Body) if err != nil { return err } return nil } func rip(albumId string, token string, storefront string) error { meta, err := getMeta(albumId, token, storefront) if err != nil { fmt.Println("Failed to get album metadata.\n") return err } albumFolder := fmt.Sprintf("%s - %s", meta.Data[0].Attributes.ArtistName, meta.Data[0].Attributes.Name) sanAlbumFolder := filepath.Join("AM-DL downloads", forbiddenNames.ReplaceAllString(albumFolder, "_")) os.MkdirAll(sanAlbumFolder, os.ModePerm) fmt.Println(albumFolder) err = writeCover(sanAlbumFolder, meta.Data[0].Attributes.Artwork.URL) if err != nil { fmt.Println("Failed to write cover.") } trackTotal := len(meta.Data[0].Relationships.Tracks.Data) for trackNum, track := range meta.Data[0].Relationships.Tracks.Data { trackNum++ fmt.Printf("Track %d of %d:\n", trackNum, trackTotal) manifest, err := getInfoFromAdam(track.ID, token, storefront) if err != nil { fmt.Println("Failed to get manifest.\n", err) continue } if manifest.Attributes.ExtendedAssetUrls.EnhancedHls == "" { fmt.Println("Unavailable in ALAC.") continue } filename := fmt.Sprintf("%02d. %s.m4a", trackNum, forbiddenNames.ReplaceAllString(track.Attributes.Name, "_")) trackPath := filepath.Join(sanAlbumFolder, filename) exists, err := fileExists(trackPath) if err != nil { fmt.Println("Failed to check if track exists.") } if exists { fmt.Println("Track already exists locally.") continue } trackUrl, keys, err := extractMedia(manifest.Attributes.ExtendedAssetUrls.EnhancedHls) if err != nil { fmt.Println("Failed to extract info from manifest.\n", err) continue } info, err := extractSong(trackUrl) if err != nil { fmt.Println("Failed to extract track.", err) continue } samplesOk := true for samplesOk { for _, i := range info.samples { if int(i.descIndex) >= len(keys) { fmt.Println("Decryption size mismatch.") samplesOk = false } } break } if !samplesOk { continue } err = decryptSong(info, keys, meta, trackPath, trackNum, trackTotal) if err != nil { fmt.Println("Failed to decrypt track.\n", err) continue } } return err } func main() { token, err := getToken() if err != nil { fmt.Println("Failed to get token.") return } albumTotal := len(os.Args[1:]) for albumNum, url := range os.Args[1:] { fmt.Printf("Album %d of %d:\n", albumNum+1, albumTotal) storefront, albumId := checkUrl(url) if albumId == "" { fmt.Printf("Invalid URL: %s\n", url) continue } err := rip(albumId, token, storefront) if err != nil { fmt.Println("Album failed.") fmt.Println(err) } } } func extractMedia(b string) (string, []string, error) { masterUrl, err := url.Parse(b) if err != nil { return "", nil, err } resp, err := http.Get(b) if err != nil { return "", nil, err } defer resp.Body.Close() if resp.StatusCode != http.StatusOK { return "", nil, errors.New(resp.Status) } body, err := io.ReadAll(resp.Body) if err != nil { return "", nil, err } masterString := string(body) from, listType, err := m3u8.DecodeFrom(strings.NewReader(masterString), true) if err != nil || listType != m3u8.MASTER { return "", nil, errors.New("m3u8 not of master type") } master := from.(*m3u8.MasterPlaylist) var streamUrl *url.URL sort.Slice(master.Variants, func(i, j int) bool { return master.Variants[i].AverageBandwidth > master.Variants[j].AverageBandwidth }) for _, variant := range master.Variants { if variant.Codecs == "alac" { split := strings.Split(variant.Audio, "-") length := len(split) fmt.Printf("%s-bit / %s Hz\n", split[length-1], split[length-2]) streamUrlTemp, err := 
masterUrl.Parse(variant.URI) if err != nil { panic(err) } streamUrl = streamUrlTemp break } } if streamUrl == nil { return "", nil, errors.New("no alac codec found") } var keys []string keys = append(keys, prefetchKey) streamUrl.Path = strings.TrimSuffix(streamUrl.Path, ".m3u8") + "_m.mp4" regex := regexp.MustCompile(`"(skd?://[^"]*)"`) matches := regex.FindAllStringSubmatch(masterString, -1) for _, match := range matches { if strings.HasSuffix(match[1], "c23") || strings.HasSuffix(match[1], "c6") { keys = append(keys, match[1]) } } return streamUrl.String(), keys, nil } func extractSong(url string) (*SongInfo, error) { fmt.Println("Downloading...") track, err := http.Get(url) if err != nil { return nil, err } defer track.Body.Close() if track.StatusCode != http.StatusOK { return nil, errors.New(track.Status) } rawSong, err := ioutil.ReadAll(track.Body) if err != nil { return nil, err } fmt.Println("Downloaded.") f := bytes.NewReader(rawSong) trex, err := mp4.ExtractBoxWithPayload(f, nil, []mp4.BoxType{ mp4.BoxTypeMoov(), mp4.BoxTypeMvex(), mp4.BoxTypeTrex(), }) if err != nil || len(trex) != 1 { return nil, err } trexPay := trex[0].Payload.(*mp4.Trex) stbl, err := mp4.ExtractBox(f, nil, []mp4.BoxType{ mp4.BoxTypeMoov(), mp4.BoxTypeTrak(), mp4.BoxTypeMdia(), mp4.BoxTypeMinf(), mp4.BoxTypeStbl(), }) if err != nil || len(stbl) != 1 { return nil, err } enca, err := mp4.ExtractBoxWithPayload(f, stbl[0], []mp4.BoxType{ mp4.BoxTypeStsd(), mp4.BoxTypeEnca(), }) if err != nil { return nil, err } aalac, err := mp4.ExtractBoxWithPayload(f, &enca[0].Info, []mp4.BoxType{BoxTypeAlac()}) if err != nil || len(aalac) != 1 { return nil, err } extracted := &SongInfo{ r: f, alacParam: aalac[0].Payload.(*Alac), } moofs, err := mp4.ExtractBox(f, nil, []mp4.BoxType{ mp4.BoxTypeMoof(), }) if err != nil || len(moofs) <= 0 { return nil, err } mdats, err := mp4.ExtractBoxWithPayload(f, nil, []mp4.BoxType{ mp4.BoxTypeMdat(), }) if err != nil || len(mdats) != len(moofs) { return nil, err } for i, moof := range moofs { tfhd, err := mp4.ExtractBoxWithPayload(f, moof, []mp4.BoxType{ mp4.BoxTypeTraf(), mp4.BoxTypeTfhd(), }) if err != nil || len(tfhd) != 1 { return nil, err } tfhdPay := tfhd[0].Payload.(*mp4.Tfhd) index := tfhdPay.SampleDescriptionIndex if index != 0 { index-- } truns, err := mp4.ExtractBoxWithPayload(f, moof, []mp4.BoxType{ mp4.BoxTypeTraf(), mp4.BoxTypeTrun(), }) if err != nil || len(truns) <= 0 { return nil, err } mdat := mdats[i].Payload.(*mp4.Mdat).Data for _, t := range truns { for _, en := range t.Payload.(*mp4.Trun).Entries { info := SampleInfo{descIndex: index} switch { case t.Payload.CheckFlag(0x200): info.data = mdat[:en.SampleSize] mdat = mdat[en.SampleSize:] case tfhdPay.CheckFlag(0x10): info.data = mdat[:tfhdPay.DefaultSampleSize] mdat = mdat[tfhdPay.DefaultSampleSize:] default: info.data = mdat[:trexPay.DefaultSampleSize] mdat = mdat[trexPay.DefaultSampleSize:] } switch { case t.Payload.CheckFlag(0x100): info.duration = en.SampleDuration case tfhdPay.CheckFlag(0x8): info.duration = tfhdPay.DefaultSampleDuration default: info.duration = trexPay.DefaultSampleDuration } extracted.samples = append(extracted.samples, info) } } if len(mdat) != 0 { return nil, errors.New("offset mismatch") } } return extracted, nil } func init() { mp4.AddBoxDef((*Alac)(nil)) } func BoxTypeAlac() mp4.BoxType { return mp4.StrToBoxType("alac") } type Alac struct { mp4.FullBox `mp4:"extend"` FrameLength uint32 `mp4:"size=32"` CompatibleVersion uint8 `mp4:"size=8"` BitDepth uint8 `mp4:"size=8"` Pb uint8 `mp4:"size=8"` 
Mb uint8 `mp4:"size=8"` Kb uint8 `mp4:"size=8"` NumChannels uint8 `mp4:"size=8"` MaxRun uint16 `mp4:"size=16"` MaxFrameBytes uint32 `mp4:"size=32"` AvgBitRate uint32 `mp4:"size=32"` SampleRate uint32 `mp4:"size=32"` } func getInfoFromAdam(adamId string, token string, storefront string) (*SongData, error) { request, err := http.NewRequest("GET", fmt.Sprintf("https://amp-api.music.apple.com/v1/catalog/%s/songs/%s", storefront, adamId), nil) if err != nil { return nil, err } query := url.Values{} query.Set("extend", "extendedAssetUrls") query.Set("include", "albums") request.URL.RawQuery = query.Encode() request.Header.Set("Authorization", fmt.Sprintf("Bearer %s", token)) request.Header.Set("User-Agent", "iTunes/12.11.3 (Windows; Microsoft Windows 10 x64 Professional Edition (Build 19041); x64) AppleWebKit/7611.1022.4001.1 (dt:2)") request.Header.Set("Origin", "https://music.apple.com") do, err := http.DefaultClient.Do(request) if err != nil { return nil, err } defer do.Body.Close() if do.StatusCode != http.StatusOK { return nil, errors.New(do.Status) } obj := new(ApiResult) err = json.NewDecoder(do.Body).Decode(&obj) if err != nil { return nil, err } for _, d := range obj.Data { if d.ID == adamId { return &d, nil } } return nil, nil } func getToken() (string, error) { req, err := http.NewRequest("GET", "https://beta.music.apple.com", nil) if err != nil { return "", err } resp, err := http.DefaultClient.Do(req) if err != nil { return "", err } defer resp.Body.Close() body, err := io.ReadAll(resp.Body) if err != nil { return "", err } regex := regexp.MustCompile(`/assets/index-legacy-[^/]+\.js`) indexJsUri := regex.FindString(string(body)) req, err = http.NewRequest("GET", "https://beta.music.apple.com"+indexJsUri, nil) if err != nil { return "", err } resp, err = http.DefaultClient.Do(req) if err != nil { return "", err } defer resp.Body.Close() body, err = io.ReadAll(resp.Body) if err != nil { return "", err } regex = regexp.MustCompile(`eyJh([^"]*)`) token := regex.FindString(string(body)) return token, nil } type ApiResult struct { Data []SongData `json:"data"` } type SongAttributes struct { ArtistName string `json:"artistName"` DiscNumber int `json:"discNumber"` GenreNames []string `json:"genreNames"` ExtendedAssetUrls struct { EnhancedHls string `json:"enhancedHls"` } `json:"extendedAssetUrls"` IsMasteredForItunes bool `json:"isMasteredForItunes"` ReleaseDate string `json:"releaseDate"` Name string `json:"name"` Isrc string `json:"isrc"` AlbumName string `json:"albumName"` TrackNumber int `json:"trackNumber"` ComposerName string `json:"composerName"` } type AlbumAttributes struct { ArtistName string `json:"artistName"` IsSingle bool `json:"isSingle"` IsComplete bool `json:"isComplete"` GenreNames []string `json:"genreNames"` TrackCount int `json:"trackCount"` IsMasteredForItunes bool `json:"isMasteredForItunes"` ReleaseDate string `json:"releaseDate"` Name string `json:"name"` RecordLabel string `json:"recordLabel"` Upc string `json:"upc"` Copyright string `json:"copyright"` IsCompilation bool `json:"isCompilation"` } type SongData struct { ID string `json:"id"` Attributes SongAttributes `json:"attributes"` Relationships struct { Albums struct { Data []struct { ID string `json:"id"` Type string `json:"type"` Href string `json:"href"` Attributes AlbumAttributes `json:"attributes"` } `json:"data"` } `json:"albums"` Artists struct { Href string `json:"href"` Data []struct { ID string `json:"id"` Type string `json:"type"` Href string `json:"href"` } `json:"data"` } `json:"artists"` } 
`json:"relationships"` } type SongResult struct { Artwork struct { Width int `json:"width"` URL string `json:"url"` Height int `json:"height"` TextColor3 string `json:"textColor3"` TextColor2 string `json:"textColor2"` TextColor4 string `json:"textColor4"` HasAlpha bool `json:"hasAlpha"` TextColor1 string `json:"textColor1"` BgColor string `json:"bgColor"` HasP3 bool `json:"hasP3"` SupportsLayeredImage bool `json:"supportsLayeredImage"` } `json:"artwork"` ArtistName string `json:"artistName"` CollectionID string `json:"collectionId"` DiscNumber int `json:"discNumber"` GenreNames []string `json:"genreNames"` ID string `json:"id"` DurationInMillis int `json:"durationInMillis"` ReleaseDate string `json:"releaseDate"` ContentRatingsBySystem struct { } `json:"contentRatingsBySystem"` Name string `json:"name"` Composer struct { Name string `json:"name"` URL string `json:"url"` } `json:"composer"` EditorialArtwork struct { } `json:"editorialArtwork"` CollectionName string `json:"collectionName"` AssetUrls struct { Plus string `json:"plus"` Lightweight string `json:"lightweight"` SuperLightweight string `json:"superLightweight"` LightweightPlus string `json:"lightweightPlus"` EnhancedHls string `json:"enhancedHls"` } `json:"assetUrls"` AudioTraits []string `json:"audioTraits"` Kind string `json:"kind"` Copyright string `json:"copyright"` ArtistID string `json:"artistId"` Genres []struct { GenreID string `json:"genreId"` Name string `json:"name"` URL string `json:"url"` MediaType string `json:"mediaType"` } `json:"genres"` TrackNumber int `json:"trackNumber"` AudioLocale string `json:"audioLocale"` Offers []struct { ActionText struct { Short string `json:"short"` Medium string `json:"medium"` Long string `json:"long"` Downloaded string `json:"downloaded"` Downloading string `json:"downloading"` } `json:"actionText"` Type string `json:"type"` PriceFormatted string `json:"priceFormatted"` Price float64 `json:"price"` BuyParams string `json:"buyParams"` Variant string `json:"variant,omitempty"` Assets []struct { Flavor string `json:"flavor"` Preview struct { Duration int `json:"duration"` URL string `json:"url"` } `json:"preview"` Size int `json:"size"` Duration int `json:"duration"` } `json:"assets"` } `json:"offers"` } type iTunesLookup struct { Results map[string]SongResult `json:"results"` } type Meta struct { Context string `json:"@context"` Type string `json:"@type"` Name string `json:"name"` Description string `json:"description"` Tracks []struct { Type string `json:"@type"` Name string `json:"name"` Audio struct { Type string `json:"@type"` } `json:"audio"` Offers struct { Type string `json:"@type"` Category string `json:"category"` Price int `json:"price"` } `json:"offers"` Duration string `json:"duration"` } `json:"tracks"` Citation []interface{} `json:"citation"` WorkExample []struct { Type string `json:"@type"` Name string `json:"name"` URL string `json:"url"` Audio struct { Type string `json:"@type"` } `json:"audio"` Offers struct { Type string `json:"@type"` Category string `json:"category"` Price int `json:"price"` } `json:"offers"` Duration string `json:"duration"` } `json:"workExample"` Genre []string `json:"genre"` DatePublished time.Time `json:"datePublished"` ByArtist struct { Type string `json:"@type"` URL string `json:"url"` Name string `json:"name"` } `json:"byArtist"` } type AutoGenerated struct { Data []struct { ID string `json:"id"` Type string `json:"type"` Href string `json:"href"` Attributes struct { Artwork struct { Width int `json:"width"` Height int `json:"height"` URL 
string `json:"url"` BgColor string `json:"bgColor"` TextColor1 string `json:"textColor1"` TextColor2 string `json:"textColor2"` TextColor3 string `json:"textColor3"` TextColor4 string `json:"textColor4"` } `json:"artwork"` ArtistName string `json:"artistName"` IsSingle bool `json:"isSingle"` URL string `json:"url"` IsComplete bool `json:"isComplete"` GenreNames []string `json:"genreNames"` TrackCount int `json:"trackCount"` IsMasteredForItunes bool `json:"isMasteredForItunes"` ReleaseDate string `json:"releaseDate"` Name string `json:"name"` RecordLabel string `json:"recordLabel"` Upc string `json:"upc"` AudioTraits []string `json:"audioTraits"` Copyright string `json:"copyright"` PlayParams struct { ID string `json:"id"` Kind string `json:"kind"` } `json:"playParams"` IsCompilation bool `json:"isCompilation"` } `json:"attributes"` Relationships struct { RecordLabels struct { Href string `json:"href"` Data []interface{} `json:"data"` } `json:"record-labels"` Artists struct { Href string `json:"href"` Data []struct { ID string `json:"id"` Type string `json:"type"` Href string `json:"href"` Attributes struct { Name string `json:"name"` } `json:"attributes"` } `json:"data"` } `json:"artists"` Tracks struct { Href string `json:"href"` Data []struct { ID string `json:"id"` Type string `json:"type"` Href string `json:"href"` Attributes struct { Previews []struct { URL string `json:"url"` } `json:"previews"` Artwork struct { Width int `json:"width"` Height int `json:"height"` URL string `json:"url"` BgColor string `json:"bgColor"` TextColor1 string `json:"textColor1"` TextColor2 string `json:"textColor2"` TextColor3 string `json:"textColor3"` TextColor4 string `json:"textColor4"` } `json:"artwork"` ArtistName string `json:"artistName"` URL string `json:"url"` DiscNumber int `json:"discNumber"` GenreNames []string `json:"genreNames"` HasTimeSyncedLyrics bool `json:"hasTimeSyncedLyrics"` IsMasteredForItunes bool `json:"isMasteredForItunes"` DurationInMillis int `json:"durationInMillis"` ReleaseDate string `json:"releaseDate"` Name string `json:"name"` Isrc string `json:"isrc"` AudioTraits []string `json:"audioTraits"` HasLyrics bool `json:"hasLyrics"` AlbumName string `json:"albumName"` PlayParams struct { ID string `json:"id"` Kind string `json:"kind"` } `json:"playParams"` TrackNumber int `json:"trackNumber"` AudioLocale string `json:"audioLocale"` ComposerName string `json:"composerName"` } `json:"attributes"` Relationships struct { Artists struct { Href string `json:"href"` Data []struct { ID string `json:"id"` Type string `json:"type"` Href string `json:"href"` Attributes struct { Name string `json:"name"` } `json:"attributes"` } `json:"data"` } `json:"artists"` } `json:"relationships"` } `json:"data"` } `json:"tracks"` } `json:"relationships"` } `json:"data"` } this does something, but i need to add functionality like to download all my songs which is in my library add this function to this code
c197dff84cfd71e98da02267f97cd758
{ "intermediate": 0.33890464901924133, "beginner": 0.44517913460731506, "expert": 0.21591626107692719 }
45,269
How do I get the feasible distance in the Cisco CLI for EIGRP?
7ec2a71e8dcf5291b4e7683f40d9852a
{ "intermediate": 0.3801356852054596, "beginner": 0.13829687237739563, "expert": 0.48156747221946716 }
45,270
아래 문제를 해결하고 결국 작성해야하는 코드를 출력해주시오. ## 0 - Introduction Welcome to the <span style="color:yellowgreen">Foundations of Machine Learning</span> (ECE5984_41) course! This is the <span style="color:red">first</span> lab practice for this class. You will implement linear regression algorithm and apply it to the provided dataset. ### Outline - [ 1 - Packages ](#1-packages) - [ 2 - Problem Statement and Dataset ](#2-problem-statement-and-dataset) - [ 3 - Linear Regression ](#3-linear-regression) - [ 3.1 Normal Equation ](#31-normal-equation) - Exercise 1 - [ 3.2 Gradient Descent ](#32-gradient-descent) - Exercise 2, Exercise 3, Exercise 4 ## 1. Packages You have to install and use below packages for your HW#1. - [numpy](https://www.numpy.org): Fundamental package for matrix computation with python. - [matplotlib](https://matplotlib.org): Package for visualization of graph with python. - [pandas](https://pandas.pydata.org): Open source package for data analysis and manipulation. **Do not use other machine learning packages in this homework, e.g., *tensorflow*, *pytorch*, *jax*, etc.** import math import copy import numpy as np import matplotlib.pyplot as plt import pandas as pd %matplotlib inline ## 2. Problem Statement and Dataset ### Problem Statement [Student Marks Dataset](https://www.kaggle.com/datasets/nikhil7280/student-performance-multiple-linear-regression) The dataset consists 100 samples of Marks(Grades) of students including their "study time" & "number of courses". In this homework, we would like to fit the linear regression model on this dataset. For practice, we will only use "study time" attribute. ### Dataset You will start by loading the dataset. Make sure the attached 'Student_Marks.csv' file is in your current directory path (can be downloaded from icampus). Each column describes the below attributes: - \[ time_study \]: Total hour the student has studied. - \[ Marks \]: The final score the student has achieved. Run below cells and take a look at the dataset! # Load the dataset dataset = pd.read_csv("./Student_Marks.csv") # Display top 5 elements dataset.head(5) # check data type of each attributes dataset.dtypes # check dataset shape: (num data, num attributes) dataset.shape # Extract "Hours Studied" and "Performance Index" attributes. # Each attributes will be the data and label, respectively. data = dataset["time_study"].to_numpy() label = dataset["Marks"].to_numpy() # Check data statistics print("The shape of data is", data.shape) print("The shape of label is", label.shape) print() print("The first Five elements of data are", data[:5]) print("The first Five elements of label are", label[:5]) print() # Plot data plt.scatter(data, label, marker='x', c='r') plt.title("Study Time - Grade") plt.xlabel("Study Time") plt.ylabel("Grade") plt.show() ## 3. Linear Regression ### 3.1 Normal Equation ### <span style="color:#ffd33d"> Exercise 1 </span>: Complete the code for the normal equation # Complete the code for the normal equation def normal_equation(data, label): """ Directly compute optimal w* = (w, b) of the linear regression model Args: data (np.ndarray): Shape (N, ) Input ot the model label (np.ndarray): Shape (N, ) Label of the data Return: w (np.ndarray): weight(slope) of the linear regression model b (np.ndarray): bias of the linear regression model """ #[NOTE] write your from code here! 
return w, b w, b = normal_equation(data, label) print(f"w: {w:.6f}, b: {b:.6f}") Plot your linear regression model and check if it works well # Obtain weight and bias of your linear regression model w, b = normal_equation(data, label) # Plot data plt.scatter(data, label, marker='x', c='r') plt.title("Study Time - Grade") plt.xlabel("Study Time") plt.ylabel("Grade") x_min, x_max = 0, math.ceil(max(data)) y_min, y_max = 0, math.ceil(max(label)) x = np.array([x_min, x_max]) y = w * x + b plt.plot(x, y) plt.show() ### 3.2 Gradient Descent ### <span style="color:#ffd33d"> Exercise 2 </span>: Complete the code for computing the loss(cost) \> Here, the <span style="color:skyblue">compute_loss()</span> method computes the loss of the linear regression model. Feel free to change or add variables, but do not modify the method name and its arguements. def compute_loss(data, label, w, b): """ Compute the loss(cost) of the linear regression model, given data and label. Args: data (np.ndarray): Shape (N, ) Input to the model label (np.ndarray): Shape (N, ) Label of the data w (float): Weight of the linear regression model b (float): Bias of the linear regression model Return: total_loss (float): Total loss of the linear regression model, given data and label """ total_loss = 0 #[NOTE] write your code here! return total_loss ### <span style="color:#ffd33d"> Exercise 4 </span>: Complete the code for computing the gradient of the loss w.r.t weights and bias \> Here, the <span style="color:skyblue">compute_gradient()</span> method computes the gradient of the loss with respect to the parameters of the linear regression model. Feel free to change or add variables, but do not modify the method name and its arguments. def compute_gradient(data, label, w, b): """ Compute the loss(cost) of the linear regression model, given data and label. Args: data (np.ndarray): Shape (N, ) Input to the model label (np.ndarray): Shape (N, ) Label of the data w (float): Weight of the linear regression model b (float): Bias of the linear regression model Return: grad_w (float): The gradient of the loss w.r.t weight w grad_b (float): The gradient of the loss w.r.t bias b """ grad_w, grad_b = 0, 0 #[NOTE] write your code here! return grad_w, grad_b ### <span style="color:#ffd33d"> Exercise 4 </span>: Complete the code of gradient descent algorithm \> <span style="color:skyblue">gradient_descent()</span> method applies gradient descent on the given dataset. You should make 'loop' inside this method to iteratively update w and b. You __**must**__ use <span style="color:skyblue">compute_loss()</span> and <span style="color:skyblue">compute_gradient()</span> method to calculate loss and gradient, respectively. Also, for the purpose of visualization of the training curve, this method returns some 'history' lists. You can just simply append updated values to the corresponding 'history' list. Feel free to change or add variables, but do not modify the method name, its arguments, and return values. def gradient_descent(data, label, w_init, b_init, iters=1500, lr=0.0001): """ Performs batch gradient descent to obatain weight and bias of the linear regression model. 
Args: data (np.ndarray): Shape (N,) label (np.ndarray): Shape (N,) w_init (float): Initial value of weight of the model b_init (float): Initial values of bias of the model lr (float): Learning rate iters (int): Number of iterations to run gradient descent Returns w (float): Weight of the 1D linear regression model obtained with BGD b (float): Bias of the 1D linear regression model obtained with BGD loss_history (list): loss values of every iteration steps w_history (list): w values of every iteration steps b_history (list): b values of every iteration steps """ loss_history = [] w_history = [] b_history = [] w = w_init b = b_init loss = 0 for i in range(iters): #[NOTE] write your code here! # print loss for every 100 iterations if i % 100 == 99: print(f"[ {i + 1:4}/{iters} ] Loss: {loss:.4f} | w: {w:.4f} | b: {b:.4f}") return w, b, loss_history, w_history, b_history Now apply your gradient descent algorithm. (Also try other hyperparameter values!) # initial values of w and b w_init, b_init = .0, .0 # hyperparameters for gradient descent algorithm iters = 1500 learning_rate = 0.0001 # obtain w and b with gradient descent w_gd, b_gd, loss_history, w_history, b_history = gradient_descent( data, label, w_init, b_init, iters=iters, lr=learning_rate, ) print("w,b found by gradient descent:", w_gd, b_gd) # obtain w and b with normal equation w_ne, b_ne = normal_equation(data, label) print("w,b found by normal equation:", w_ne, b_ne) loss_history = loss_history[100:] # it is okay to remove this line # plot training curve plt.plot(range(len(loss_history)), loss_history) plt.title("Training Curve") plt.xlabel("Iter") plt.ylabel("Loss") plt.show()
6b695625c818e4a305c1820ba694a485
{ "intermediate": 0.33538728952407837, "beginner": 0.31918978691101074, "expert": 0.3454229235649109 }
45,271
How do I do if "go" in "I AM GOING" in Arduino
4846abc0d62fcd9b692c086d3de08289
{ "intermediate": 0.3196963667869568, "beginner": 0.34285208582878113, "expert": 0.3374515473842621 }
45,272
how to del C:\Folder fully
84bb5047507e47172816aafd242e16d5
{ "intermediate": 0.2946944236755371, "beginner": 0.37864014506340027, "expert": 0.32666537165641785 }
45,273
create a Tkinter console that prints everything in my script to that console
37990b65188e447c8ea6182e6cecd7c2
{ "intermediate": 0.3562793731689453, "beginner": 0.3113304376602173, "expert": 0.3323901295661926 }
45,274
Hi, I have a GitLab self-hosted server and I would like to create and register a runner to use CI/CD in a project of mine. Do you know how to do that?
052f258a9e39e36f8346f63e33bfed4d
{ "intermediate": 0.5581840872764587, "beginner": 0.13997510075569153, "expert": 0.3018408417701721 }
45,275
…spent on it, sorted in descending order of that amount. The list of payments is in the Payments table. Use the alias sum when outputting the amount. Database schema: FamilyMembers (member_id INT, member_name VARCHAR, birthday DATETIME), Payments (payment_id INT, family_member INT, good INT, amount INT, unit_price INT, date DATETIME), Goods (good_id INT, good_name VARCHAR, type INT), GoodTypes (good_type_id INT, good_type_na VARCHAR)
49ef698707452d8cc7ee6a64f491e66a
{ "intermediate": 0.35669946670532227, "beginner": 0.2542106509208679, "expert": 0.3890898525714874 }
45,276
Flask: return the body of a POST request
482be619c21036f1aac1a8fad9173930
{ "intermediate": 0.3446630835533142, "beginner": 0.32461902499198914, "expert": 0.3307178318500519 }
45,277
For each individual payment, output the product identifier and the amount spent on it, sorted in descending order of that amount. The list of payments is in the Payments table. Use the alias sum when outputting the amount. Database schema: FamilyMembers (member_id INT, member_name VARCHAR, birthday DATETIME), Payments (payment_id INT, family_member INT, good INT, amount INT, unit_price INT, date DATETIME), Goods (good_id INT, good_name VARCHAR, type INT), GoodTypes (good_type_id INT, good_type_na VARCHAR)
SELECT payment_ id, amount*unit_price AS sum FROM Payments ORDER BY amount*unit_price DESC
8aefdf74fe8fbb244b8f558c075da677
{ "intermediate": 0.3605566918849945, "beginner": 0.37406817078590393, "expert": 0.26537519693374634 }
45,278
For each individual payment, output the product identifier and the amount spent on it, sorted in descending order of that amount. The list of payments is in the Payments table. Use the alias sum when outputting the amount. Database schema: FamilyMembers (member_id INT, member_name VARCHAR, birthday DATETIME), Payments (payment_id INT, family_member INT, good INT, amount INT, unit_price INT, date DATETIME), Goods (good_id INT, good_name VARCHAR, type INT), GoodTypes (good_type_id INT, good_type_na VARCHAR) Here is my solution:
SELECT payment_ id, amount*unit_price AS sum FROM Payments ORDER BY amount*unit_price DESC
• The column names do not match the expected ones
61652274b3afcb63e1f9f406720e771b
{ "intermediate": 0.3903711438179016, "beginner": 0.32022544741630554, "expert": 0.28940340876579285 }
45,279
Sublime Text: bind multiple functions to one button
63e28f15f22f14904f10345595ed9dcc
{ "intermediate": 0.29696154594421387, "beginner": 0.39049890637397766, "expert": 0.31253960728645325 }
45,280
Gostaria que alterasse esse script para que eu somente copie e cole ele por inteiro, mas que adicione linhas horizontais com escritas no centro dessas linhas encima de cada caixa macro definida, que essas scritas indiquem o horario de início dessa caixa e o fim dela por exemplo : 10:20 - 10:40 estaria escrito em uma linha horizontal acima da caixa: // This Pine Script™ code is subject to the terms of the Mozilla Public License 2.0 at https://mozilla.org/MPL/2.0/ // © dabarlea // © davidescobar.fx //@version=5 indicator("ICT Hydra Macros", "ICT Hydra Macros", true, max_labels_count = 500, max_lines_count = 500, max_boxes_count = 500) // ---------------------------------------- Inputs -------------------------------------------------- var GENERAL_SETTINGS = "General Settings" max_days = input.int(5, "Limit Days to Draw", 1, group = GENERAL_SETTINGS) max_timeframe = input.timeframe("5", "Timeframe Limit", group = GENERAL_SETTINGS) gmt_timezone = input.string('GMT-5', "Timezone", options = ['GMT-12','GMT-11','GMT-10','GMT-9','GMT-8','GMT-7','GMT-6','GMT-5','GMT-4','GMT-3','GMT-2','GMT-1','GMT+0','GMT+1','GMT+2','GMT+3','GMT+4','GMT+5','GMT+6','GMT+7','GMT+8','GMT+9','GMT+10','GMT+11','GMT+12'], group = GENERAL_SETTINGS) var MACROS_SETTINGS = "Macros Settings" show_macros = input.bool(true, "Show Macros Boxes", inline = "Macros", group = MACROS_SETTINGS) show_macros_text = input.bool(true, "Display Text", inline = "Macros", group = MACROS_SETTINGS) macros_transparency = input.int(80, "Macros Transparency", 0, 100, group = MACROS_SETTINGS) use_m1 = input.bool(true, "Macro 1", inline = "Macro1", group = MACROS_SETTINGS) m1 = input.session("0320-0340", "", inline = "Macro1", group = MACROS_SETTINGS) m1_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro1", group = MACROS_SETTINGS) use_m2 = input.bool(true, "Macro 2", inline = "Macro2", group = MACROS_SETTINGS) m2 = input.session("0350-0410", "", inline = "Macro2", group = MACROS_SETTINGS) m2_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro2", group = MACROS_SETTINGS) use_m3 = input.bool(true, "Macro 3", inline = "Macro3", group = MACROS_SETTINGS) m3 = input.session("0420-0440", "", inline = "Macro3", group = MACROS_SETTINGS) m3_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro3", group = MACROS_SETTINGS) use_m4 = input.bool(true, "Macro 4", inline = "Macro4", group = MACROS_SETTINGS) m4 = input.session("0450-0510", "", inline = "Macro4", group = MACROS_SETTINGS) m4_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro4", group = MACROS_SETTINGS) use_m5 = input.bool(true, "Macro 5", inline = "Macro5", group = MACROS_SETTINGS) m5 = input.session("0520-0540", "", inline = "Macro5", group = MACROS_SETTINGS) m5_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro5", group = MACROS_SETTINGS) use_m6 = input.bool(true, "Macro 6", inline = "Macro6", group = MACROS_SETTINGS) m6 = input.session("0550-0610", "", inline = "Macro6", group = MACROS_SETTINGS) m6_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro6", group = MACROS_SETTINGS) use_m7 = input.bool(true, "Macro 7", inline = "Macro7", group = MACROS_SETTINGS) m7 = input.session("0620-0640", "", inline = "Macro7", group = MACROS_SETTINGS) m7_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro7", group = MACROS_SETTINGS) use_m8 = input.bool(true, "Macro 8", inline = "Macro8", group = MACROS_SETTINGS) m8 = input.session("0650-0710", "", inline = "Macro8", group = MACROS_SETTINGS) 
m8_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro8", group = MACROS_SETTINGS) use_m9 = input.bool(true, "Macro 9", inline = "Macro9", group = MACROS_SETTINGS) m9 = input.session("0720-0740", "", inline = "Macro9", group = MACROS_SETTINGS) m9_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro9", group = MACROS_SETTINGS) use_m10 = input.bool(true, "Macro 10", inline = "Macro10", group = MACROS_SETTINGS) m10 = input.session("0750-0810", "", inline = "Macro10", group = MACROS_SETTINGS) m10_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro10", group = MACROS_SETTINGS) use_m11 = input.bool(true, "Macro 11", inline = "Macro11", group = MACROS_SETTINGS) m11 = input.session("0820-0840", "", inline = "Macro11", group = MACROS_SETTINGS) m11_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro11", group = MACROS_SETTINGS) use_m12 = input.bool(true, "Macro 12", inline = "Macro12", group = MACROS_SETTINGS) m12 = input.session("0850-0910", "", inline = "Macro12", group = MACROS_SETTINGS) m12_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro12", group = MACROS_SETTINGS) use_m13 = input.bool(true, "Macro 13", inline = "Macro13", group = MACROS_SETTINGS) m13 = input.session("0920-0940", "", inline = "Macro13", group = MACROS_SETTINGS) m13_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro13", group = MACROS_SETTINGS) use_m14 = input.bool(true, "Macro 14", inline = "Macro14", group = MACROS_SETTINGS) m14 = input.session("0950-1010", "", inline = "Macro14", group = MACROS_SETTINGS) m14_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro14", group = MACROS_SETTINGS) use_m15 = input.bool(true, "Macro 15", inline = "Macro15", group = MACROS_SETTINGS) m15 = input.session("1020-1040", "", inline = "Macro15", group = MACROS_SETTINGS) m15_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro15", group = MACROS_SETTINGS) use_m16 = input.bool(true, "Macro 16", inline = "Macro16", group = MACROS_SETTINGS) m16 = input.session("1050-1110", "", inline = "Macro16", group = MACROS_SETTINGS) m16_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro16", group = MACROS_SETTINGS) use_m17 = input.bool(true, "Macro 17", inline = "Macro17", group = MACROS_SETTINGS) m17 = input.session("1120-1140", "", inline = "Macro17", group = MACROS_SETTINGS) m17_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro17", group = MACROS_SETTINGS) use_m18 = input.bool(true, "Macro 18", inline = "Macro18", group = MACROS_SETTINGS) m18 = input.session("1150-1210", "", inline = "Macro18", group = MACROS_SETTINGS) m18_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro18", group = MACROS_SETTINGS) use_m19 = input.bool(true, "Macro 19", inline = "Macro19", group = MACROS_SETTINGS) m19 = input.session("1220-1240", "", inline = "Macro19", group = MACROS_SETTINGS) m19_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro19", group = MACROS_SETTINGS) use_m20 = input.bool(true, "Macro 20", inline = "Macro20", group = MACROS_SETTINGS) m20 = input.session("1250-1310", "", inline = "Macro20", group = MACROS_SETTINGS) m20_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro20", group = MACROS_SETTINGS) use_m21 = input.bool(true, "Macro 21", inline = "Macro21", group = MACROS_SETTINGS) m21 = input.session("1320-1340", "", inline = "Macro21", group = MACROS_SETTINGS) m21_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro21", 
group = MACROS_SETTINGS) use_m22 = input.bool(true, "Macro 22", inline = "Macro22", group = MACROS_SETTINGS) m22 = input.session("1350-1410", "", inline = "Macro22", group = MACROS_SETTINGS) m22_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro22", group = MACROS_SETTINGS) use_m23 = input.bool(true, "Macro 23", inline = "Macro23", group = MACROS_SETTINGS) m23 = input.session("1420-1440", "", inline = "Macro23", group = MACROS_SETTINGS) m23_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro23", group = MACROS_SETTINGS) use_m24 = input.bool(true, "Macro 24", inline = "Macro24", group = MACROS_SETTINGS) m24 = input.session("1450-1510", "", inline = "Macro24", group = MACROS_SETTINGS) m24_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro24", group = MACROS_SETTINGS) use_m25 = input.bool(true, "Macro 25", inline = "Macro25", group = MACROS_SETTINGS) m25 = input.session("1520-1540", "", inline = "Macro25", group = MACROS_SETTINGS) m25_color = input.color(color.rgb(155, 39, 176, 80), "", inline = "Macro25", group = MACROS_SETTINGS) use_m26 = input.bool(true, "Macro 26", inline = "Macro26", group = MACROS_SETTINGS) m26 = input.session("1550-1610", "", inline = "Macro26", group = MACROS_SETTINGS) m26_color = input.color(color.rgb(33, 149, 243, 80), "", inline = "Macro26", group = MACROS_SETTINGS) // ---------------------------------------- Inputs -------------------------------------------------- // ---------------------------------------- Variables & Constants -------------------------------------------------- m1_time = not na(time("", m1, gmt_timezone)) m2_time = not na(time("", m2, gmt_timezone)) m3_time = not na(time("", m3, gmt_timezone)) m4_time = not na(time("", m4, gmt_timezone)) m5_time = not na(time("", m5, gmt_timezone)) m6_time = not na(time("", m6, gmt_timezone)) m7_time = not na(time("", m7, gmt_timezone)) m8_time = not na(time("", m8, gmt_timezone)) m9_time = not na(time("", m9, gmt_timezone)) m10_time = not na(time("", m10, gmt_timezone)) m11_time = not na(time("", m11, gmt_timezone)) m12_time = not na(time("", m12, gmt_timezone)) m13_time = not na(time("", m13, gmt_timezone)) m14_time = not na(time("", m14, gmt_timezone)) m15_time = not na(time("", m15, gmt_timezone)) m16_time = not na(time("", m16, gmt_timezone)) m17_time = not na(time("", m17, gmt_timezone)) m18_time = not na(time("", m18, gmt_timezone)) m19_time = not na(time("", m19, gmt_timezone)) m20_time = not na(time("", m20, gmt_timezone)) m21_time = not na(time("", m21, gmt_timezone)) m22_time = not na(time("", m22, gmt_timezone)) m23_time = not na(time("", m23, gmt_timezone)) m24_time = not na(time("", m24, gmt_timezone)) m25_time = not na(time("", m25, gmt_timezone)) m26_time = not na(time("", m26, gmt_timezone)) var m1_box = array.new_box() var m2_box = array.new_box() var m3_box = array.new_box() var m4_box = array.new_box() var m5_box = array.new_box() var m6_box = array.new_box() var m7_box = array.new_box() var m8_box = array.new_box() var m9_box = array.new_box() var m10_box = array.new_box() var m11_box = array.new_box() var m12_box = array.new_box() var m13_box = array.new_box() var m14_box = array.new_box() var m15_box = array.new_box() var m16_box = array.new_box() var m17_box = array.new_box() var m18_box = array.new_box() var m19_box = array.new_box() var m20_box = array.new_box() var m21_box = array.new_box() var m22_box = array.new_box() var m23_box = array.new_box() var m24_box = array.new_box() var m25_box = array.new_box() var m26_box = 
array.new_box() m1_text = "" m2_text = "" m3_text = "" m4_text = "" m5_text = "" m6_text = "" m7_text = "" m8_text = "" m9_text = "" m10_text = "" m11_text = "" m12_text = "" m13_text = "" m14_text = "" m15_text = "" m16_text = "" m17_text = "" m18_text = "" m19_text = "" m20_text = "" m21_text = "" m22_text = "" m23_text = "" m24_text = "" m25_text = "" m26_text = "" m1_color := color.new(m1_color, macros_transparency) m2_color := color.new(m2_color, macros_transparency) m3_color := color.new(m3_color, macros_transparency) m4_color := color.new(m4_color, macros_transparency) m5_color := color.new(m5_color, macros_transparency) m6_color := color.new(m6_color, macros_transparency) m7_color := color.new(m7_color, macros_transparency) m8_color := color.new(m8_color, macros_transparency) m9_color := color.new(m9_color, macros_transparency) m10_color := color.new(m10_color, macros_transparency) m11_color := color.new(m11_color, macros_transparency) m12_color := color.new(m12_color, macros_transparency) m13_color := color.new(m13_color, macros_transparency) m14_color := color.new(m14_color, macros_transparency) m15_color := color.new(m15_color, macros_transparency) m16_color := color.new(m16_color, macros_transparency) m17_color := color.new(m17_color, macros_transparency) m18_color := color.new(m18_color, macros_transparency) m19_color := color.new(m19_color, macros_transparency) m20_color := color.new(m20_color, macros_transparency) m21_color := color.new(m21_color, macros_transparency) m22_color := color.new(m22_color, macros_transparency) m23_color := color.new(m23_color, macros_transparency) m24_color := color.new(m24_color, macros_transparency) m25_color := color.new(m25_color, macros_transparency) m26_color := color.new(m26_color, macros_transparency) // ---------------------------------------- Variables & Constants -------------------------------------------------- // ---------------------------------------- Functions -------------------------------------------------- get_time_text(_session) => startHour = str.substring(_session, 0, 2) startMin = str.substring(_session, 2, 4) endHour = str.substring(_session, 5, 7) endMin = str.substring(_session, 7, 9) startHour + ":" + startMin + "\n" + endHour + ":" + endMin m1_text := get_time_text(m1) m2_text := get_time_text(m2) m3_text := get_time_text(m3) m4_text := get_time_text(m4) m5_text := get_time_text(m5) m6_text := get_time_text(m6) m7_text := get_time_text(m7) m8_text := get_time_text(m8) m9_text := get_time_text(m9) m10_text := get_time_text(m10) m11_text := get_time_text(m11) m12_text := get_time_text(m12) m13_text := get_time_text(m13) m14_text := get_time_text(m14) m15_text := get_time_text(m15) m16_text := get_time_text(m16) m17_text := get_time_text(m17) m18_text := get_time_text(m18) m19_text := get_time_text(m19) m20_text := get_time_text(m20) m21_text := get_time_text(m21) m22_text := get_time_text(m22) m23_text := get_time_text(m23) m24_text := get_time_text(m24) m25_text := get_time_text(m25) m26_text := get_time_text(m26) adjust(_box) => _box.set_right(bar_index) _top = _box.get_top() _bot = _box.get_bottom() if high > _top _box.set_top(high) if low < _bot _box.set_bottom(low) check_array(_arr) => if _arr.size() > max_days _arr.pop().delete() // ---------------------------------------- Functions -------------------------------------------------- // ---------------------------------------- Core Logic -------------------------------------------------- if timeframe.in_seconds("") <= timeframe.in_seconds(max_timeframe) // Macro 1 
if use_m1 if m1_time and not m1_time[1] if show_macros m1_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m1_color, bgcolor = m1_color, text = show_macros_text ? m1_text : na, text_color = m1_color)) else if m1_time adjust(m1_box.get(0)) // Macro 2 if use_m2 if m2_time and not m2_time[1] if show_macros m2_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m2_color, bgcolor = m2_color, text = show_macros_text ? m2_text : na, text_color = m2_color)) else if m2_time adjust(m2_box.get(0)) // Macro 3 if use_m3 if m3_time and not m3_time[1] if show_macros m3_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m3_color, bgcolor = m3_color, text = show_macros_text ? m3_text : na, text_color = m3_color)) else if m3_time adjust(m3_box.get(0)) // Macro 4 if use_m4 if m4_time and not m4_time[1] if show_macros m4_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m4_color, bgcolor = m4_color, text = show_macros_text ? m4_text : na, text_color = m4_color)) else if m4_time adjust(m4_box.get(0)) // Macro 5 if use_m5 if m5_time and not m5_time[1] if show_macros m5_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m5_color, bgcolor = m5_color, text = show_macros_text ? m5_text : na, text_color = m5_color)) else if m5_time adjust(m5_box.get(0)) // Macro 6 if use_m6 if m6_time and not m6_time[1] if show_macros m6_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m6_color, bgcolor = m6_color, text = show_macros_text ? m6_text : na, text_color = m6_color)) else if m6_time adjust(m6_box.get(0)) // Macro 7 if use_m7 if m7_time and not m7_time[1] if show_macros m7_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m7_color, bgcolor = m7_color, text = show_macros_text ? m7_text : na, text_color = m7_color)) else if m7_time adjust(m7_box.get(0)) // Macro 8 if use_m8 if m8_time and not m8_time[1] if show_macros m8_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m8_color, bgcolor = m8_color, text = show_macros_text ? m8_text : na, text_color = m8_color)) else if m8_time adjust(m8_box.get(0)) // Macro 9 if use_m9 if m9_time and not m9_time[1] if show_macros m9_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m9_color, bgcolor = m9_color, text = show_macros_text ? m9_text : na, text_color = m9_color)) else if m9_time adjust(m9_box.get(0)) // Macro 10 if use_m10 if m10_time and not m10_time[1] if show_macros m10_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m10_color, bgcolor = m10_color, text = show_macros_text ? m10_text : na, text_color = m10_color)) else if m10_time adjust(m10_box.get(0)) // Macro 11 if use_m11 if m11_time and not m11_time[1] if show_macros m11_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m11_color, bgcolor = m11_color, text = show_macros_text ? m11_text : na, text_color = m11_color)) else if m11_time adjust(m11_box.get(0)) // Macro 12 if use_m12 if m12_time and not m12_time[1] if show_macros m12_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m12_color, bgcolor = m12_color, text = show_macros_text ? m12_text : na, text_color = m12_color)) else if m12_time adjust(m12_box.get(0)) // Macro 13 if use_m13 if m13_time and not m13_time[1] if show_macros m13_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m13_color, bgcolor = m13_color, text = show_macros_text ? 
m13_text : na, text_color = m13_color)) else if m13_time adjust(m13_box.get(0)) // Macro 14 if use_m14 if m14_time and not m14_time[1] if show_macros m14_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m14_color, bgcolor = m14_color, text = show_macros_text ? m14_text : na, text_color = m14_color)) else if m14_time adjust(m14_box.get(0)) // Macro 15 if use_m15 if m15_time and not m15_time[1] if show_macros m15_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m15_color, bgcolor = m15_color, text = show_macros_text ? m15_text : na, text_color = m15_color)) else if m15_time adjust(m15_box.get(0)) // Macro 16 if use_m16 if m16_time and not m16_time[1] if show_macros m16_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m16_color, bgcolor = m16_color, text = show_macros_text ? m16_text : na, text_color = m16_color)) else if m16_time adjust(m16_box.get(0)) // Macro 17 if use_m17 if m17_time and not m17_time[1] if show_macros m17_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m17_color, bgcolor = m17_color, text = show_macros_text ? m17_text : na, text_color = m17_color)) else if m17_time adjust(m17_box.get(0)) // Macro 18 if use_m18 if m18_time and not m18_time[1] if show_macros m18_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m18_color, bgcolor = m18_color, text = show_macros_text ? m18_text : na, text_color = m18_color)) else if m18_time adjust(m18_box.get(0)) // Macro 19 if use_m19 if m19_time and not m19_time[1] if show_macros m19_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m19_color, bgcolor = m19_color, text = show_macros_text ? m19_text : na, text_color = m19_color)) else if m19_time adjust(m19_box.get(0)) // Macro 20 if use_m20 if m20_time and not m20_time[1] if show_macros m20_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m20_color, bgcolor = m20_color, text = show_macros_text ? m20_text : na, text_color = m20_color)) else if m20_time adjust(m20_box.get(0)) // Macro 21 if use_m21 if m21_time and not m21_time[1] if show_macros m21_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m21_color, bgcolor = m21_color, text = show_macros_text ? m21_text : na, text_color = m21_color)) else if m21_time adjust(m21_box.get(0)) // Macro 22 if use_m22 if m22_time and not m22_time[1] if show_macros m22_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m22_color, bgcolor = m22_color, text = show_macros_text ? m22_text : na, text_color = m22_color)) else if m22_time adjust(m22_box.get(0)) // Macro 23 if use_m23 if m23_time and not m23_time[1] if show_macros m23_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m23_color, bgcolor = m23_color, text = show_macros_text ? m23_text : na, text_color = m23_color)) else if m23_time adjust(m23_box.get(0)) // Macro 24 if use_m24 if m24_time and not m24_time[1] if show_macros m24_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m24_color, bgcolor = m24_color, text = show_macros_text ? m24_text : na, text_color = m24_color)) else if m24_time adjust(m24_box.get(0)) // Macro 25 if use_m25 if m25_time and not m25_time[1] if show_macros m25_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m25_color, bgcolor = m25_color, text = show_macros_text ? 
m25_text : na, text_color = m25_color)) else if m25_time adjust(m25_box.get(0)) // Macro 26 if use_m26 if m26_time and not m26_time[1] if show_macros m26_box.unshift(box.new(bar_index, high, bar_index, low, border_color = m26_color, bgcolor = m26_color, text = show_macros_text ? m26_text : na, text_color = m26_color)) else if m26_time adjust(m26_box.get(0)) check_array(m1_box) check_array(m2_box) check_array(m3_box) check_array(m4_box) check_array(m5_box) check_array(m6_box) check_array(m7_box) check_array(m8_box) check_array(m9_box) check_array(m10_box) check_array(m11_box) check_array(m12_box) check_array(m13_box) check_array(m14_box) check_array(m15_box) check_array(m16_box) check_array(m17_box) check_array(m18_box) check_array(m19_box) check_array(m20_box) check_array(m21_box) check_array(m22_box) check_array(m23_box) check_array(m24_box) check_array(m25_box) check_array(m26_box) // ---------------------------------------- Core Logic --------------------------------------------------
a50deca3ad8c750411ff3014ec715818
{ "intermediate": 0.31118226051330566, "beginner": 0.507112443447113, "expert": 0.1817053258419037 }