DDA3020 Homework 1
Due date: Oct 14, 2024
Instructions
• The deadline is 23:59, Oct 14, 2024.
• The weight of this assignment in the final grade is 20%.
• Electronic submission: Turn in solutions electronically via Blackboard. Be sure to submit your homework as one PDF file plus two Python scripts. Please name your solution files as "DDA3020HW1 studentID name.pdf", "HW1 yourID Q1.ipynb" and "HW1 yourID Q2.ipynb" (.py files are also acceptable).
• Note that late submissions will result in discounted scores: 0-24 hours → 80%, 24-120 hours
→ 50%, 120 or more hours → 0%.
• Answer the questions in English. Otherwise, you’ll lose half of the points.
• Collaboration policy: You need to solve all questions independently and collaboration between
students is NOT allowed.
1 Written Problems (50 points)
1.1. (Learning of Linear Regression, 25 points) Suppose we have training data
$\{(x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N)\}$,
where $x_i \in \mathbb{R}^d$ and $y_i \in \mathbb{R}^k$, $i = 1, 2, \ldots, N$.
i) (9 pts) Find the closed-form solution of the following problem:
$$\min_{W, b} \sum_{i=1}^{N} \| y_i - W x_i - b \|_2^2.$$
ii) (8 pts) Show how to use gradient descent to solve the problem. (Please state at least one possible stopping criterion.)
iii) (8 pts) We further suppose that $x_1, x_2, \ldots, x_N$ are drawn from $\mathcal{N}(\mu, \sigma^2)$. Show that the maximum likelihood estimate (MLE) of $\sigma^2$ is
$$\hat{\sigma}^2_{\mathrm{MLE}} = \frac{1}{N} \sum_{n=1}^{N} (x_n - \mu_{\mathrm{MLE}})^2.$$
1.2. (Support Vector Machine, 25 points) Given two positive samples $x_1 = (3, 3)^T$ and $x_2 = (4, 3)^T$, and one negative sample $x_3 = (1, 1)^T$, find the maximum-margin separating hyperplane and support vectors.
Solution steps:
i) Formulating the Optimization Problem (5 pts)
ii) Constructing the Lagrangian (5 pts)
iii) Using KKT Conditions (5 pts)
iv) Solving the Equations (5 pts)
v) Determining the Hyperplane Equation and Support Vectors (5 pts)
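For step i), a minimal reminder of the textbook hard-margin primal form, written for this three-point dataset (with labels $y_1 = y_2 = +1$ and $y_3 = -1$):
$$\min_{w,\, b} \ \frac{1}{2}\|w\|^2 \quad \text{s.t.} \quad y_i \left( w^T x_i + b \right) \geq 1, \quad i = 1, 2, 3.$$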
2 Programming (50 points)
2.1. (Linear regression, 25 points) We have a labeled dataset $D = \{(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)\}$, with $x_i \in \mathbb{R}^d$ being the d-dimensional feature vector of the i-th sample, and $y_i \in \mathbb{R}$ being the real-valued target (label).
A linear regression model is given by
$$f_{w_0, \ldots, w_d}(x) = w_0 + w_1 x_1 + w_2 x_2 + \cdots + w_d x_d, \qquad (1)$$
where $w_0$ is often called the bias and $w_1, w_2, \ldots, w_d$ are often called the coefficients.
Now, we want to utilize the dataset D to build a linear model based on linear regression. We provide a training set Dtrain that includes 2024 labeled samples with 11 features (see linear regression train.txt) to fit the model, and a test set Dtest that includes 10 unlabeled samples with 11 features (see linear regression test.txt) to evaluate the model.
1. Use the LinearRegression class from the sklearn package to get the bias $w_0$ and the coefficients $w_1, w_2, \ldots, w_{11}$, then compute $\hat{y} = f(x)$ on the test set Dtest with the trained model. (Put the estimates of $w_0, w_1, \ldots, w_{11}$ and these $\hat{y}$ in your answers.)
2. Implement linear regression by yourself to obtain the bias $w_0$ and the coefficients $w_1, w_2, \ldots, w_{11}$, then compute $\hat{y} = f(x)$ on the test set Dtest. (Put the estimates of $w_0, w_1, \ldots, w_{11}$ and these $\hat{y}$ in your answers. You may compute the inverse of a matrix using an existing Python package.)
(Hint: Note that linear regression train.txt has 2024 rows with 12 columns, where the first 11 columns are the features x and the last column is the target y; linear regression test.txt contains only 10 rows with 11 columns (features). Both tasks require the submission of code and results. Put all the code in a "HW1 yourID Q1.ipynb" Jupyter notebook file (a ".py" file is also acceptable).)
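A minimal sketch of both tasks, assuming the files are whitespace-separated and named linear_regression_train.txt / linear_regression_test.txt (adjust paths and delimiters to the actual files):

import numpy as np
from sklearn.linear_model import LinearRegression

# Assumed file names and whitespace-separated columns; adjust to the real files.
train = np.loadtxt("linear_regression_train.txt")   # 2024 rows x 12 columns
X_train, y_train = train[:, :11], train[:, 11]
X_test = np.loadtxt("linear_regression_test.txt")   # 10 rows x 11 columns

# Task 1: sklearn's LinearRegression.
model = LinearRegression().fit(X_train, y_train)
print("w0:", model.intercept_, "w1..w11:", model.coef_)
print("y_hat:", model.predict(X_test))

# Task 2: closed-form normal equations w = (A^T A)^{-1} A^T y,
# where A prepends a column of ones so w[0] plays the role of the bias.
A = np.hstack([np.ones((X_train.shape[0], 1)), X_train])
w = np.linalg.inv(A.T @ A) @ A.T @ y_train
print("w0:", w[0], "w1..w11:", w[1:])
print("y_hat:", np.hstack([np.ones((X_test.shape[0], 1)), X_test]) @ w)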
2.2. (SVM, 25 points)
Task Description You are asked to write a program that constructs support vector machine
models with different kernel functions and slack variables.
Datasets You are provided with the iris dataset. The data set contains 3 classes of 50 instances
each, where each class refers to a type of iris plant. There are four features: 1. sepal length in cm;
2. sepal width in cm; 3. petal length in cm; 4. petal width in cm. You need to use these features
to classify each iris plant as one of the three possible types.
What you should do You should use the SVM functionality from the Python sklearn package, which provides various forms of SVM. For multiclass SVM you should use the one-vs-rest strategy. You are recommended to use the sklearn.svm.SVC class. You can use numpy for vector manipulation. In your technical report, you should report the results required below (e.g. training error, testing error, and so on).
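A minimal sketch of this API; wrapping SVC in OneVsRestClassifier makes the one-vs-rest strategy explicit and exposes one binary SVM per class:

from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
clf = OneVsRestClassifier(SVC(kernel="linear", C=1e5)).fit(X, y)

# clf.estimators_[k] is the binary SVM separating class k from the rest;
# its coef_, intercept_, and support_ expose w, b, and support vector indices.
for k, est in enumerate(clf.estimators_):
    print("class", k, "w:", est.coef_.ravel(), "b:", est.intercept_,
          "support vectors:", est.support_)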
1. (2 points) Split training set and test set. Split the data into a training set and a test set. The training set should contain 70% of the samples, while the test set should include 30%. The number of samples from each category in both the training and test sets should reflect this 70-30 split; for each category, the first 70% of the samples will form the training set, and the remaining 30% will form the test set. Ensure that the split maintains the original order of the data. You should report the instance ids in the split training set and test set. The output format is as follows:
Q2.2.1 Split training set and test set:
Training set: xx
Test set: xx
You should fill in xx in the template. Write the ids for each set on the same line, separated by commas, e.g. Training set: [1, 4, 19]. (A minimal split sketch is shown below.)
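A sketch of the ordered per-class split, assuming the iris samples are grouped by class in their original order (as in sklearn's load_iris):

import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
train_idx, test_idx = [], []
for c in np.unique(y):
    ids = np.where(y == c)[0]        # original order within class c
    cut = int(len(ids) * 0.7)        # first 70% -> training set
    train_idx += list(ids[:cut])
    test_idx += list(ids[cut:])
print("Training set:", train_idx)
print("Test set:", test_idx)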
2. (10 points) Calculation using Standard SVM Model (Linear Kernel). Employ the standard SVM model with a linear kernel. Train your SVM on the split training dataset and validate it on the testing dataset. Calculate the classification error for both the training and testing datasets, and output the weight vector w, the bias b, and the indices of the support vectors (starting from 0). Note that the scikit-learn package does not offer a hard-margin option, so we will simulate it using C = 1e5. You should first print out the total training error and testing error, where the error is (number of wrong predictions) / (number of data points). Then, print out the results for each class separately (note that you should calculate errors for each class separately in this part). You should also mention in your report which classes are linearly separable with SVM without slack. (A minimal parameter-extraction sketch appears at the end of this item.)
The output format is as follows:
Q2.2.2 Calculation using Standard SVM Model:
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
Linear separable classes: xx
If we view the one-vs-rest strategy as combining multiple different SVMs, each one being a separating hyperplane between one class and the rest of the points, then the w, b, and support vector indices for that class are the corresponding parameters of the SVM separating this class from the rest of the points. If a variable is of vector form, say $a = (1, 2, 3)^T$, then you should write each entry on the same line, separated by commas, e.g. [1,2,3].
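A minimal sketch for this item, reusing the ordered 70/30 split from above; C = 1e5 simulates the hard margin as noted:

import numpy as np
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
tr = np.r_[0:35, 50:85, 100:135]        # first 70% of each class (assumed order)
te = np.setdiff1d(np.arange(150), tr)   # remaining 30%

clf = OneVsRestClassifier(SVC(kernel="linear", C=1e5)).fit(X[tr], y[tr])
print("total training error:", (clf.predict(X[tr]) != y[tr]).mean())
print("total testing error:", (clf.predict(X[te]) != y[te]).mean())
for name, est in zip(["setosa", "versicolor", "virginica"], clf.estimators_):
    print(name, "w:", est.coef_.ravel(), "b:", est.intercept_,
          "support vector indices:", est.support_)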
3. (6 points) Calculation using SVM with Slack Variables (Linear Kernel). For each C = 0.25 × t, where t = 1, 2, . . . , 4, train your SVM on the training dataset, and subsequently validate it on the testing dataset. Calculate the classification error for both the training and testing datasets, the weight vector w, the bias b, the indices of the support vectors, and the slack variable ζ of each support vector (you may compute it as max(0, 1 − y · f(x))). The output format is as follows (a slack-computation sketch follows the templates):
Q2.2.3 Calculation using SVM with Slack Variables (C = 0.25 × t, where t = 1, . . . , 4):
-------------------------------------------
C=0.25,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
-------------------------------------------
C=0.5,
<... results for (C=0.5) ...>
-------------------------------------------
C=0.75,
<... results for (C=0.75) ...>
-------------------------------------------
C=1,
<... results for (C=1) ...>
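A minimal sketch of the slack computation, assuming one-vs-rest labels y ∈ {−1, +1} per class and the same ordered split as above:

import numpy as np
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
tr = np.r_[0:35, 50:85, 100:135]        # same ordered 70% split as above

for t in range(1, 5):
    C = 0.25 * t
    clf = OneVsRestClassifier(SVC(kernel="linear", C=C)).fit(X[tr], y[tr])
    print(f"C={C},")
    for k, est in enumerate(clf.estimators_):
        y_bin = np.where(y[tr] == k, 1, -1)        # one-vs-rest labels
        f = est.decision_function(X[tr])
        slack = np.maximum(0, 1 - y_bin[est.support_] * f[est.support_])
        print(f"  class {k} slack variable:", np.round(slack, 4))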
4. (7 points) Calculation using SVM with Kernel Functions. Conduct experiments with different kernel functions for SVM without slack variables. Calculate the classification error for both the training and testing datasets, and the indices of the support vectors, for each kernel type (a configuration sketch follows this item's output templates):
(a) 2nd-order Polynomial Kernel
(b) 3rd-order Polynomial Kernel
(c) Radial Basis Function Kernel with σ = 1
(d) Sigmoidal Kernel with σ = 1
The output format is as follows:
Q2.2.4 Calculation using SVM with Kernel Functions:
-------------------------------------------
(a) 2nd-order Polynomial Kernel,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
-------------------------------------------
(b) 3rd-order Polynomial Kernel,
<... results for (b) ...>
-------------------------------------------
(c) Radial Basis Function Kernel with σ = 1,
<... results for (c) ...>
-------------------------------------------
(d) Sigmoidal Kernel with σ = 1,
<... results for (d) ...>
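A sketch of the kernel configurations. Note that sklearn parameterizes the rbf and sigmoid kernels by gamma rather than σ; gamma = 1/(2σ²) = 0.5 is one common mapping for σ = 1, but this is an assumption, so check the convention used in your course:

import numpy as np
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
tr = np.r_[0:35, 50:85, 100:135]        # same ordered 70% split as above
te = np.setdiff1d(np.arange(150), tr)

kernels = {
    "(a) 2nd-order Polynomial Kernel": SVC(kernel="poly", degree=2, C=1e5),
    "(b) 3rd-order Polynomial Kernel": SVC(kernel="poly", degree=3, C=1e5),
    "(c) RBF Kernel (sigma=1)": SVC(kernel="rbf", gamma=0.5, C=1e5),
    "(d) Sigmoidal Kernel (sigma=1)": SVC(kernel="sigmoid", gamma=0.5, C=1e5),
}
for name, svc in kernels.items():
    clf = OneVsRestClassifier(svc).fit(X[tr], y[tr])   # svc is cloned internally
    print(name,
          "total training error:", (clf.predict(X[tr]) != y[tr]).mean(),
          "total testing error:", (clf.predict(X[te]) != y[te]).mean())
    for k, est in enumerate(clf.estimators_):
        print("  class", k, "support vector indices:", est.support_)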
Submission Submit your executable code in a "HW1 yourID Q2.ipynb" Jupyter notebook (a ".py" file is also acceptable). Indicate the corresponding question number in a comment for each cell, and ensure that your code can logically produce the required results for each question in the required format. Please note that you need to write clear comments and use appropriate function/variable names. Excessively unreadable code may result in point deductions.
