Numerical Methods for Nonlinear Least-Squares Problems: Powell's Dog Leg Method (II, a Simple Python Example)


Companion post:

非线性最小二乘问题的数值方法 —— 狗腿法 Powell's Dog Leg Method (I - 原理与算法)


0. Preface

This post is a simple hands-on extension of the preceding post "非线性最小二乘问题的数值方法 —— 狗腿法 Powell's Dog Leg Method (I - 原理与算法)".

The theory is covered in that post and is not repeated here; this post only adds a simple Python example.


1. An Example Optimization Problem

$$ {\rm minimize}\quad g(\mathbf{x}) = \frac{1}{2}\|\mathbf{r}(\mathbf{x})\|_2^2 = \frac{1}{2}\sum_{i=1}^{3} r_i(\mathbf{x})^2 \tag{I-1} $$

where

$$ \mathbf{x} = \begin{bmatrix} x_1, x_2 \end{bmatrix}^{\small\rm T} $$

$$ \mathbf{r}(\mathbf{x}) = \begin{bmatrix} r_1(\mathbf{x}), \, r_2(\mathbf{x}), \, r_3(\mathbf{x}) \end{bmatrix}^{\small\rm T} $$

$$ r_1(\mathbf{x}) = \sin x_1 - 0.4 $$

$$ r_2(\mathbf{x}) = \cos x_2 + 0.8 $$

$$ r_3(\mathbf{x}) = \sqrt{x_1^2 + x_2^2} - 1 $$

From these residuals we can derive

$$ \frac{\partial \mathbf{r}(\mathbf{x})}{\partial \mathbf{x}} = \begin{bmatrix} \cos x_1 & 0 \\ 0 & -\sin x_2 \\ \frac{x_1}{\sqrt{x_1^2+x_2^2}} & \frac{x_2}{\sqrt{x_1^2+x_2^2}} \end{bmatrix} $$

$$ g(\mathbf{x}) = \frac{1}{2}\left[ \left(\sin x_1 - 0.4\right)^2 + \left(\cos x_2 + 0.8\right)^2 + \left(\sqrt{x_1^2 + x_2^2} - 1\right)^2 \right] $$

$$ \nabla g(\mathbf{x}) = \begin{bmatrix} \frac{x_1\left(\sqrt{x_1^2+x_2^2}-1\right)}{\sqrt{x_1^2+x_2^2}} + \cos x_1 \left(\sin x_1 - 0.4\right) \\ \frac{x_2\left(\sqrt{x_1^2+x_2^2}-1\right)}{\sqrt{x_1^2+x_2^2}} - \sin x_2 \left(\cos x_2 + 0.8\right) \end{bmatrix} $$

$$ \widetilde{\mathbf{H}}(\mathbf{x}) = \begin{bmatrix} \frac{x_1^2}{x_1^2+x_2^2} + \cos^2 x_1 & \frac{x_1 x_2}{x_1^2+x_2^2} \\ \frac{x_1 x_2}{x_1^2+x_2^2} & \frac{x_2^2}{x_1^2+x_2^2} + \sin^2 x_2 \end{bmatrix} $$

For the detailed symbolic derivation, see the post 非线性最小二乘问题的数值方法 —— 从牛顿迭代法到高斯-牛顿法 (实例篇 V).
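As a quick sanity check on the expressions above, the Jacobian, gradient, and simplified Hessian can also be reproduced symbolically. Below is a minimal sketch using SymPy (assumed to be installed; the symbol names x1, x2 are chosen here for illustration and are not part of the original post):

```python
# Hedged sketch: symbolic verification of dr/dx, the gradient of g,
# and the simplified (Gauss-Newton) Hessian J^T J defined above.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
r_vec = sp.Matrix([sp.sin(x1) - 0.4,
                   sp.cos(x2) + 0.8,
                   sp.sqrt(x1**2 + x2**2) - 1])

J = r_vec.jacobian([x1, x2])   # should match the Jacobian matrix above
grad_g = J.T * r_vec           # should match the gradient of g above
H_tilde = J.T * J              # simplified Hessian used by Gauss-Newton / dog leg

print(sp.simplify(J))
print(sp.simplify(grad_g))
print(sp.simplify(H_tilde))
```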


2. Python Implementation of Powell's Dog Leg Method

Following the algorithm flow of the dog-leg method described in Part I, a simple Python demo is implemented below:

```python
from mpl_toolkits.mplot3d import axes3d
import matplotlib.pyplot as plt
import numpy as np
from numpy.linalg import inv, det, norm
from math import cos
from math import sin
from math import sqrt
from math import pow

# multiplication of two matrices
def multiply_matrix(A, B):
    if A.shape[1] == B.shape[0]:
        C = np.zeros((A.shape[0], B.shape[1]), dtype=float)
        [rows, cols] = C.shape
        for row in range(rows):
            for col in range(cols):
                for elt in range(len(B)):
                    C[row, col] += A[row, elt] * B[elt, col]
        return C
    else:
        return "Cannot multiply A and B. Please check whether the dimensions of the inputs are compatible."

# g(x) = (1/2) ||r(x)||_2^2
def g(x_vector):
    x_1 = x_vector[0]
    x_2 = x_vector[1]
    return (pow(sin(x_1) - 0.4, 2) + pow(cos(x_2) + 0.8, 2)
            + pow(sqrt(pow(x_2, 2) + pow(x_1, 2)) - 1, 2)) / 2

# r(x) = [r_1, r_2, r_3]^{T}
def r(x_vector):
    x_1 = x_vector[0]
    x_2 = x_vector[1]
    return np.array([[sin(x_1) - 0.4],
                     [cos(x_2) + 0.8],
                     [sqrt(pow(x_1, 2) + pow(x_2, 2)) - 1]], dtype=object)

# \partial r(x) / \partial x
def dr(x_vector):
    x_1 = x_vector[0]
    x_2 = x_vector[1]
    if sqrt(pow(x_2, 2) + pow(x_1, 2)) < 1e-3:  # guard added by hand against division by zero near the origin
        return np.array([[cos(x_1), 0],
                         [0, -sin(x_2)],
                         [0, 0]], dtype=object)
    else:
        return np.array([[cos(x_1), 0],
                         [0, -sin(x_2)],
                         [x_1 / sqrt(pow(x_2, 2) + pow(x_1, 2)), x_2 / sqrt(pow(x_2, 2) + pow(x_1, 2))]], dtype=object)

# Simplified Hessian matrix in the Gauss-Newton method
# refer to eq. (I-1-2) in blog "非线性最小二乘问题的数值方法 —— 从高斯-牛顿法到列文伯格-马夸尔特法 (I)"
def sH(x_vector):
    dr_i = dr(x_vector)
    return multiply_matrix(np.transpose(dr_i), dr_i)

# \nabla g(x)
# refer to eq. (I-1-3) in blog "非线性最小二乘问题的数值方法 —— 从高斯-牛顿法到列文伯格-马夸尔特法 (I)"
def dg(x_vector):
    return np.array(multiply_matrix(np.transpose(dr(x_vector)), r(x_vector)))

# model of the cost function g based on eq. (II-2-2) in
# "非线性最小二乘问题的数值方法 —— 狗腿法 Powell's Dog Leg Method (I - 原理与算法)"
def L_model(h_vector, g_i, dg_i, sH_i):
    return g_i + multiply_matrix(dg_i.transpose(), h_vector) \
           + 0.5 * multiply_matrix(multiply_matrix(h_vector.transpose(), sH_i), h_vector)

def dog_leg_method(x_vector, epsilon_1, epsilon_2, epsilon_3, max_iter, trust_region_radius):
    iter = 0
    delta = trust_region_radius                             # trust-region radius
    found = False
    x_current_vector = x_vector
    r_i = r(x_current_vector)
    dr_i = dr(x_current_vector)
    dg_i = multiply_matrix(np.transpose(dr_i), r_i)
    g_i = g(x_current_vector)
    if (norm(r_i, np.inf) < epsilon_3) or (norm(dg_i, np.inf) < epsilon_1):
        found = True
    array_x_1 = []
    array_x_2 = []
    array_x_3 = []
    while (found == False) and (iter < max_iter):
        array_x_1.append(x_current_vector[0])
        array_x_2.append(x_current_vector[1])
        array_x_3.append(g_i)
        iter += 1
        step_sd_i = -dg_i                                   # steepest-descent direction
        Jh = multiply_matrix(dr_i, step_sd_i)
        alpha_i = pow(norm(step_sd_i, 2), 2) / pow(norm(Jh, 2), 2)
        step_cp_i = alpha_i * step_sd_i                     # steepest-descent (Cauchy point) step
        sH_i = multiply_matrix(np.transpose(dr_i), dr_i)    # simplified Hessian matrix
        inv_sH_i = inv(sH_i)
        step_gn_i = -np.array(multiply_matrix(inv_sH_i, dg_i))  # Gauss-Newton step
        rho = -1
        while (rho < 0) and (found == False):               # until the step is acceptable
            if norm(step_gn_i, 2) < delta:                  # Case I
                step_dl_i = step_gn_i
                print("Iterating index [%d], Case I" % iter)
            elif norm(step_cp_i, 2) >= delta:               # Case II
                step_dl_i = (delta / norm(step_sd_i, 2)) * step_sd_i
                print("Iterating index [%d], Case II" % iter)
            else:                                           # Case III
                step_gn_cp_i = step_gn_i - step_cp_i
                gn_cp_norm_sq = pow(norm(step_gn_cp_i, 2), 2)
                delta_cp_sq = pow(delta, 2) - pow(norm(step_cp_i, 2), 2)
                c_matrix = multiply_matrix(np.transpose(step_cp_i), step_gn_cp_i)
                c = c_matrix[0][0]
                sqrt_discriminant = sqrt(pow(c, 2) + gn_cp_norm_sq * delta_cp_sq)
                if (c <= 0):
                    beta = (-c + sqrt_discriminant) / gn_cp_norm_sq
                else:
                    beta = delta_cp_sq / (c + sqrt_discriminant)
                step_dl_i = step_cp_i + beta * step_gn_cp_i
                print("Iterating index [%d], Case III" % iter)
            norm_step_dl = norm(step_dl_i, 2)
            if (norm_step_dl <= epsilon_2 * (norm(x_current_vector, 2) + epsilon_2)):
                found = True
            else:
                x_new_vector = x_current_vector + step_dl_i.flatten()
                g_new = g(x_new_vector)
                L_0 = g_i
                L_h = L_model(step_dl_i, g_i, dg_i, sH_i)
                rho = (g_i - g_new) / (L_0 - L_h)           # gain ratio
                if (rho > 0):                               # step acceptable
                    x_current_vector = x_new_vector         # new iterating state
                    r_i = r(x_current_vector)
                    dr_i = dr(x_current_vector)
                    dg_i = multiply_matrix(np.transpose(dr_i), r_i)
                    g_i = g(x_current_vector)
                    if (norm(r_i, np.inf) < epsilon_3) or (norm(dg_i, np.inf) < epsilon_1):
                        found = True
                if (rho > 0.75):                            # expanding the trust region
                    if (delta - 3 * norm_step_dl < 0):
                        delta = 3 * norm_step_dl
                elif (rho < 0.25):                          # shrinking the trust region
                    delta = delta / 2
                    if (delta < (epsilon_2 * (norm(x_current_vector, 2) + epsilon_2))):
                        found = True
    return array_x_1, array_x_2, array_x_3

def result_plot(trajectory, trust_region_radius):
    fig = plt.figure()
    ax3 = plt.axes(projection='3d')
    xx = np.arange(-5, 5, 0.1)
    yy = np.arange(-4, 4, 0.1)
    X, Y = np.meshgrid(xx, yy)
    Z = np.zeros((X.shape[0], Y.shape[1]), dtype=float)
    for i in range(X.shape[0]):
        for j in range(Y.shape[1]):
            Z[i, j] = g(np.array([X[0, j], Y[i, 0]]))
    ax3.plot_surface(X, Y, Z, rstride=1, cstride=1, cmap='rainbow', alpha=0.25)
    ax3.contour(X, Y, Z, offset=-1, cmap='rainbow')
    ax3.plot(trajectory[0], trajectory[1], trajectory[2], "r--")
    offset_data = -1 * np.ones(len(trajectory[0]))
    ax3.plot(trajectory[0], trajectory[1], offset_data, 'k--')
    ax3.set_title('Dog Leg Method \n(Initial point [%.1f, %.1f], Trust-region radius %.2f)'
                  % (trajectory[0][0], trajectory[1][0], trust_region_radius))
    ax3.set_xlabel("r_1")
    ax3.set_ylabel("r_2")
    ax3.set_zlabel("g")
    file_name_prefix = "./dog_leg"
    file_extension = ".png"
    radius = "-r"
    file_name = f"{file_name_prefix}_{trajectory[0][0]}_{trajectory[1][0]}{radius}{trust_region_radius}{file_extension}"
    print(file_name)
    plt.draw()
    plt.savefig(file_name)

if __name__ == "__main__":
    test_data = np.array([[4.9, 3.9], [-2.9, 1.9], [0.1, -0.1], [-0.1, 0.1], [0, -3.8], [1, 2.5]], dtype=object)
    trust_region_radius = np.array([0.4, 0.01, 2.0])
    for radius in trust_region_radius:
        for inital_data in test_data:
            print("\nInitial point: [%.1f, %.1f]" % (inital_data[0], inital_data[1]))
            print("Trust region radius: %.2f" % radius)
            epsilon_1 = 1e-6
            epsilon_2 = 1e-6
            epsilon_3 = 1e-6
            max_iter = 1000
            trajectory = dog_leg_method(inital_data, epsilon_1, epsilon_2, epsilon_3, max_iter, radius)
            result_plot(trajectory, radius)
```
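If SciPy is available, the minimizer reached by the demo can also be cross-checked against a library trust-region solver. This is only an assumed sanity check, not part of the original demo; note that `scipy.optimize.least_squares` with `method='dogbox'` implements a rectangular-trust-region dog-leg variant, so its iterates differ from the classic dog-leg path even though the final minimizer and cost should essentially agree (`res.cost` corresponds to g(x)):

```python
# Hedged sketch: cross-check the demo's result with SciPy's least-squares solver.
import numpy as np
from scipy.optimize import least_squares

def residual(x):
    return np.array([np.sin(x[0]) - 0.4,
                     np.cos(x[1]) + 0.8,
                     np.sqrt(x[0]**2 + x[1]**2) - 1.0])

res = least_squares(residual, x0=np.array([4.9, 3.9]), method='dogbox')
print(res.x)     # compare with the last state visited by dog_leg_method
print(res.cost)  # equals (1/2)*||r(x)||_2^2, i.e. g(x)
```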

3. Test Results

A. Result Plots

[Figures: iteration trajectories plotted on the surface and contour of g. One group shows the initial point [4.9, 3.9] with trust-region radii 0.01, 0.4, and 2.0; another shows the initial point [-2.9, 1.9] with the same radii. Additional plots cover the remaining initial points [0.1, -0.1], [-0.1, 0.1], [0, -3.8], and [1, 2.5]. Images are not reproduced here.]

B. Iteration Step Breakdown

The iteration steps for different trust-region radii, together with the type of each step, are listed below. Case III denotes a dog-leg step, Case II a Cauchy (steepest-descent) step, and Case I a Gauss-Newton step.

The closer the iterate gets to the minimizer, the more frequently Gauss-Newton steps occur, which helps the method converge quickly in the final phase.

The initial trust-region radius also affects performance: the very small radius 0.01 forces a string of Cauchy steps at the start and needs slightly more iterations than the larger radii 0.4 and 2.0.
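For reference when reading the logs, the step selection implemented in `dog_leg_method` above, which produces these case labels, can be summarized as

$$
\mathbf{h}_{\rm dl} =
\begin{cases}
\mathbf{h}_{\rm gn}, & \|\mathbf{h}_{\rm gn}\|_2 < \Delta \quad \text{(Case I)} \\
\dfrac{\Delta}{\|\mathbf{h}_{\rm sd}\|_2}\,\mathbf{h}_{\rm sd}, & \|\mathbf{h}_{\rm cp}\|_2 \ge \Delta \quad \text{(Case II)} \\
\mathbf{h}_{\rm cp} + \beta\,(\mathbf{h}_{\rm gn} - \mathbf{h}_{\rm cp}), & \text{otherwise} \quad \text{(Case III)}
\end{cases}
$$

where $\mathbf{h}_{\rm gn}$ is the Gauss-Newton step, $\mathbf{h}_{\rm sd} = -\nabla g(\mathbf{x})$ the steepest-descent direction, $\mathbf{h}_{\rm cp} = \alpha\,\mathbf{h}_{\rm sd}$ the Cauchy-point step, $\Delta$ the current trust-region radius, and $\beta \in (0, 1)$ is chosen so that $\|\mathbf{h}_{\rm dl}\|_2 = \Delta$.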

The raw iteration logs for initial points [4.9, 3.9] and [-2.9, 1.9], with trust-region radii 0.01, 0.40, and 2.00, are reproduced below.
Initial point: [4.9, 3.9]
Trust region radius: 0.01
Iterating index [1], Case II
Iterating index [2], Case II
Iterating index [3], Case II
Iterating index [4], Case II
Iterating index [5], Case II
Iterating index [6], Case II
Iterating index [7], Case I
Iterating index [8], Case I
Iterating index [9], Case I
Iterating index [10], Case I
Iterating index [11], Case I
Iterating index [12], Case I
Iterating index [13], Case I
Iterating index [14], Case I
Iterating index [15], Case I
Iterating index [16], Case I
Iterating index [17], Case I
Iterating index [18], Case I
Iterating index [19], Case I
Iterating index [20], Case I
Iterating index [21], Case I
Iterating index [22], Case I
Iterating index [23], Case I
Iterating index [24], Case I
Iterating index [25], Case I
Initial point: [4.9, 3.9]
Trust region radius: 0.40
Iterating index [1], Case II
Iterating index [2], Case II
Iterating index [3], Case III
Iterating index [4], Case III
Iterating index [4], Case III
Iterating index [5], Case I
Iterating index [5], Case I
Iterating index [5], Case III
Iterating index [6], Case II
Iterating index [7], Case I
Iterating index [8], Case I
Iterating index [9], Case I
Iterating index [10], Case I
Iterating index [11], Case I
Iterating index [12], Case I
Iterating index [13], Case I
Iterating index [14], Case I
Iterating index [15], Case I
Iterating index [16], Case I
Iterating index [17], Case I
Iterating index [18], Case I
Iterating index [19], Case I
Iterating index [20], Case I
Iterating index [21], Case I
Iterating index [22], Case I
Iterating index [23], Case I
Initial point: [4.9, 3.9]
Trust region radius: 2.00
Iterating index [1], Case II
Iterating index [2], Case I
Iterating index [3], Case I
Iterating index [3], Case I
Iterating index [3], Case III
Iterating index [4], Case I
Iterating index [5], Case I
Iterating index [6], Case I
Iterating index [7], Case I
Iterating index [8], Case I
Iterating index [9], Case I
Iterating index [10], Case I
Iterating index [11], Case I
Iterating index [12], Case I
Iterating index [13], Case I
Iterating index [14], Case I
Iterating index [15], Case I
Iterating index [16], Case I
Iterating index [17], Case I
Iterating index [18], Case I
Iterating index [19], Case I
Iterating index [20], Case I
Iterating index [21], Case I
Iterating index [22], Case I
Initial point: [-2.9, 1.9]
Trust region radius: 0.01
Iterating index [1], Case II
Iterating index [2], Case II
Iterating index [3], Case II
Iterating index [4], Case II
Iterating index [5], Case I
Iterating index [6], Case I
Iterating index [7], Case I
Iterating index [8], Case III
Iterating index [8], Case III
Iterating index [9], Case I
Iterating index [10], Case I
Iterating index [11], Case I
Iterating index [12], Case I
Iterating index [13], Case I
Iterating index [14], Case I
Iterating index [15], Case I
Iterating index [16], Case I
Iterating index [17], Case I
Iterating index [18], Case I
Iterating index [19], Case I
Iterating index [20], Case I
Iterating index [21], Case I
Iterating index [22], Case I
Iterating index [23], Case I
Iterating index [24], Case I
Iterating index [25], Case I
Iterating index [26], Case I
Initial point: [-2.9, 1.9]
Trust region radius: 0.40
Iterating index [1], Case II
Iterating index [2], Case I
Iterating index [3], Case I
Iterating index [4], Case I
Iterating index [5], Case III
Iterating index [5], Case III
Iterating index [6], Case I
Iterating index [7], Case I
Iterating index [8], Case I
Iterating index [9], Case I
Iterating index [10], Case I
Iterating index [11], Case I
Iterating index [12], Case I
Iterating index [13], Case I
Iterating index [14], Case I
Iterating index [15], Case I
Iterating index [16], Case I
Iterating index [17], Case I
Iterating index [18], Case I
Iterating index [19], Case I
Iterating index [20], Case I
Iterating index [21], Case I
Iterating index [22], Case I
Iterating index [23], Case I
Initial point: [-2.9, 1.9]
Trust region radius: 2.00
Iterating index [1], Case I
Iterating index [2], Case I
Iterating index [3], Case III
Iterating index [4], Case III
Iterating index [4], Case III
Iterating index [5], Case I
Iterating index [6], Case I
Iterating index [7], Case I
Iterating index [8], Case I
Iterating index [9], Case I
Iterating index [10], Case I
Iterating index [11], Case I
Iterating index [12], Case I
Iterating index [13], Case I
Iterating index [14], Case I
Iterating index [15], Case I
Iterating index [16], Case I
Iterating index [17], Case I
Iterating index [18], Case I
Iterating index [19], Case I
Iterating index [20], Case I
Iterating index [21], Case I
Iterating index [22], Case I

4. Conclusion

The above is only a simple demo of the dog-leg method; for the derivation and the full algorithm, please refer to "非线性最小二乘问题的数值方法 —— 狗腿法 Powell's Dog Leg Method (I - 原理与算法)".

If you spot any problems, please point them out. Thank you!
